Translating ancient code for the generative AI era

Easy access to data is critically important in today’s era of generative AI. For many organisations, this is not yet a reality: their legacy code prevents them from developing the advanced AI systems they need to propel their business forward. Legacy, or out-of-date, code can carry significant performance issues and scalability limitations, not to mention heavy maintenance costs that eat away at IT budgets.

Yet there is often a reluctance to migrate away from this legacy code, due to the risk and time the process entails, as well as the upheaval for team members who are set in their ways. So how can organisations turn this often-dreaded task of migration into an opportunity for improvement and growth? Increasingly, the answer lies in generative AI.

Translations can be tricky

Organisations increasingly recognise the need to migrate away from legacy code. Migration is the process of moving data from one location or application to another, typically to replace or augment legacy systems with new applications that share the same dataset. In today’s context, migrations are often driven by a business deciding to move from traditional on-premises infrastructure to the cloud. This digital transformation is designed to make an organisation more agile and better equipped to deal with the ever-changing technological landscape.

Despite a significant appetite for change, challenges can slow down the migration process. For instance, many organisations worry about the security implications of migration. Think of moving your favourite photos or music from one external hard drive to another, but on a vast scale. Data can be lost or duplicated in transit, leaving it vulnerable, inaccurate or gone altogether. There are also concerns about the permissions and policies attached to data, particularly unstructured data, which could leave it exposed.

Additionally, and arguably most difficult to overcome, is the vast investment of time, skills and, of course, money that migrations demand. They involve arduous, painstaking work across code, databases, security, governance and more, and doing this properly requires very specific, sought-after skills. To compound matters, legacy data is often unstructured and written in a variety of coding languages, so translating legacy code into a standardised format and a modern language that runs efficiently on a cloud data platform calls for rare expertise, as the illustration below suggests. Equally, today’s analysts and data scientists may not have the skills to understand legacy code, and this gap can leave teams reluctant to move away from a language they know well.
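To make that translation gap concrete, here is a hedged sketch: the same aggregation expressed as legacy batch SQL (shown in a comment) and as PySpark, the kind of modern code that runs natively on a cloud data platform. The table and column names are invented for illustration.

# Hypothetical illustration: the same business logic in a legacy SQL dialect
# and in modern PySpark. Table and column names are invented for the example.
#
# Legacy version (e.g., a nightly batch job in a vendor-specific dialect):
#   SELECT region, SUM(amount) AS total_amount
#   FROM   sales_history
#   WHERE  status = 'A'
#   GROUP  BY region;
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("legacy-translation-demo").getOrCreate()

# The cloud-native equivalent: the same filter-and-aggregate rule, expressed
# in PySpark so it runs distributed on a modern data platform.
totals = (
    spark.table("sales_history")
         .where(F.col("status") == "A")
         .groupBy("region")
         .agg(F.sum("amount").alias("total_amount"))
)
totals.show()

The logic is identical; the difficulty lies in having people fluent in both idioms, at the scale of an entire estate.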

Easing the burden with generative AI

Generative AI provides a path to clear the roadblocks associated with migration and make the process more manageable, primarily through its ability to augment, accelerate and streamline the many processes involved. Early in the migration, during code preparation, generative AI can accelerate the analyses that identify usage patterns. These reports help teams narrow down which data is still being used within the organisation, and thus what should be translated versus what can be archived. By analysing vast amounts of error logs, generative AI can also accelerate data quality assessment between the legacy and new platforms, ensuring that governance is upheld throughout the process. Beyond identifying and validating data, it can recommend optimal configurations and strategies for teams to use as they migrate, providing key insights as they work through the process.

One key way generative AI achieves all this is by translating code into natural language that everyone can understand, taking an enormous amount of complexity out of the whole migration effort. This step also helps teams re-evaluate the business rules and logic that were written as code, making sure they are clean, curated and still fit the overall business direction.
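As a minimal sketch of what that translation step might look like, the snippet below asks a large language model to explain a fragment of legacy code in plain English. It assumes access to an OpenAI-compatible chat API; the model name, prompt and COBOL fragment are illustrative assumptions rather than a specific recommendation.

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

LEGACY_SNIPPET = """
IDENTIFICATION DIVISION.
PROGRAM-ID. CALC-DISC.
PROCEDURE DIVISION.
    IF CUST-TIER = 'GOLD' AND ORDER-TOTAL > 1000
        COMPUTE DISCOUNT = ORDER-TOTAL * 0.10
    ELSE
        MOVE ZERO TO DISCOUNT.
"""

response = client.chat.completions.create(
    model="gpt-4o",  # assumed model; any capable code model would do
    messages=[
        {"role": "system",
         "content": "You are a migration analyst. Explain this legacy code in "
                    "plain English and list the business rules it encodes."},
        {"role": "user", "content": LEGACY_SNIPPET},
    ],
)

# The plain-English explanation becomes reviewable documentation, so business
# stakeholders can confirm the embedded rule (here, a 10% discount on gold-tier
# orders over 1,000) still fits current policy before it is re-implemented.
print(response.choices[0].message.content)

Reviewed at scale, explanations like this give stakeholders a way to validate rules that would otherwise stay buried in code nobody reads.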

Through this accelerated, augmented analysis, generative AI takes significant pressure off teams who would otherwise have to sift through all this data themselves. That pressure previously drove many teams to ‘lift and shift’ efforts, simply picking up code and data and moving them to analogues on a public cloud provider. This corner-cutting can hugely inflate cloud budgets, causing further hesitation about the cloud or, in some cases, a retreat from it altogether. Given the stakes in time and cost, it is crucial for teams to get the process right the first time. Generative AI accelerates the work, freeing people to focus on analysis rather than on the time-consuming, manual elements involved.

Bolstering the business case

Modernisation will always be a significant undertaking, requiring hard work, diligence and accuracy. However, with generative AI accelerating much of the heavy lifting involved in migration and drastically improving efficiency, teams can focus on ensuring the process’s effectiveness. The translation to natural language also goes a long way towards mitigating the organisational risk of unknown rules and logic embedded within legacy systems. Together, these gains can convince those reluctant to migrate that there is a strong business case for applying these techniques, despite possible challenges with model accuracy.

Those who modernise will put themselves in the best position to thrive in today’s era of generative AI. All AI systems, particularly the large language models that have exploded in popularity recently, are entirely reliant on the data they use. If that data is obsolete, duplicated or inaccurate, the model will be rendered useless. As part of the modernisation process, organisations must adopt a mindset geared towards thriving long-term. What is built today will soon be ‘legacy’, given how quickly the industry is advancing. A robust architecture that can adapt and evolve with the technology is therefore crucial, and modernisation is the foundation on which it must be built.


Dael Williamson is the EMEA CTO at Databricks, the data and AI company, where he supports customers with decades of commercial experience in technology and data strategy. He is also actively involved in the UK and EU start-up community and sits on the advisory boards of two start-ups. He has worked in start-ups and enterprises across various industries, including health, biotech, fitness, telecoms and media, retail and CPG, travel, automotive, manufacturing, and finance.
