Simplification of data architectures overcomes the skills shortages exposed by cloud migration


Cloud migration has taken off during the pandemic, providing many organizations with gains in agility but exposing a significant shortage of the skills needed to integrate data and supercharge business value.

The upward trend in cloud usage started almost immediately after Covid began to spread. In the early days of the outbreak, in April last year, Microsoft announced on an earnings call that it had seen two years of digital transformation in the space of two months – an early indication of what was to come across industries. 

In the public cloud, the upward trend in adoption is set to continue. Gartner, for example, predicts worldwide end-user spending on public cloud will increase by 18 percent this year, hitting $305 billion in total. Next year the consultancy predicts the figure will reach $362 billion. 

What has happened during the last 15 months is that three trends have combined to powerful effect. The first is enforced remote working; the second is the need for more effective collaboration and any-time access to data and analytics; the third is the growth in the number and quality of SaaS applications. This potent combination has made the flexibility of the public cloud very attractive to organizations looking to overcome short-term challenges and gain longer-term agility. 

Cost has, admittedly, remained an important consideration in public cloud migration: the cloud enables organizations to match resources to demand more easily than on-premises infrastructure does, with more flexible pricing and storage options. When working with big data it is possible to achieve major economies of scale without being tied into excessively rigid contracts. Once an organization has completed a specific project, it can scale back its requirements and reduce its costs. 

Organizations want access to a new ecosystem of applications 

However, the chief motivation for this rapid expansion into the public cloud is access to new ecosystems of applications built on common infrastructure and resources. The three main cloud vendors have created a broad range of services that organizations view as a platform for developing applications across the whole enterprise. Rather than different business units working separately on their own data strategies, they come together, bridging gaps that would otherwise open up.

This is important because the pressures of Covid exposed gaps in the data of many organizations struggling to function remotely. The private data center brings many challenges when an organization suddenly needs access to its services over the internet. The businesses that overcame these challenges were able to carry on serving customers without interruption. 

Containerization and AI managed services

Containerization technology, which packages software code together with its dependencies so it can run on any infrastructure, has also accelerated cloud migration. Customer relationship management, resource planning and sales applications have all been moving into the cloud. Oil, gas and major logistics companies already use public cloud computing and storage, and banks employ its flexibility and scale for mobile applications. Process automation, the continuation of remote and hybrid working at major enterprises, and the increased use of collaboration apps and services such as SASE (Secure Access Service Edge) will all further hasten migration to the cloud. 

The cloud is also able to meet the growing appetite for services enabled by artificial intelligence and machine learning (ML), delivered on a managed-services basis to optimize flexibility and cost management and to provide rapid access to the most effective solutions as they prove their worth.

For many companies cloud is a business priority 

In other words, cloud is firmly established as a strategic priority. Enterprises now have business units dedicated to data strategy that sit outside the core infrastructure team. They examine how the organization stores and uses data, how data can enrich the services it provides, and how the business can become data-driven. 

Yet for all this enthusiasm and understanding of the transformational benefits of cloud migration, many organizations lack the right skills to manage and optimize deployments and to integrate data so it can be used for maximum business value. 

Containerization has advanced considerably, but it is still not easy to migrate many in-house applications to the cloud without extensive and costly re-engineering. These may be business-critical applications built for very specific purposes that rely on operating systems not supported in the cloud. Re-engineering them requires skills most organizations do not have in-house. Network bandwidth also circumscribes what organizations can do, ruling out workloads with high-performance, low-latency requirements that demand dedicated infrastructure. This is why hybrid cloud is the future for many enterprises: they will use the cloud, via APIs, for backups and archiving, while retaining critical applications and their more sensitive data in on-premises data centers. This hybrid infrastructure becomes challenging to manage when the ebb and flow of demand requires constant change and the development of new applications.

Simplicity is essential to successful cloud deployment 

What has become clear over the last 15 months is that the organizations that thrive in the cloud will be those that keep their new data infrastructure simple, so it can adapt to new requirements and remain easy to manage. Complexity is a built-in disadvantage that only undermines long-term flexibility. 

Far more effective is a simplified approach that leverages a flexible, cloud-agnostic data platform which also supports on-premises deployments. This gives enterprises the ability to develop, deploy and maintain their solutions in cloud and hybrid environments, to support multiple clouds, and to change cloud provider without rewriting code. Automation takes care of deployment to whichever environment the use case requires. Containerization capabilities within the platform provide the agility and repeatability needed to transform how organizations respond to business and technology needs, since containers are platform-independent, fully portable and easily tuneable.

By making data fully interoperable between systems and applications, a single platform takes care of database management and analytics. Organizations can enjoy the efficiency and agility of the cloud and containerized software without undergoing major development or retooling – specialist areas where they may lack the necessary skills. They can more easily achieve all the advantages of cloud infrastructure, adopting new software engineering processes that simplify the development, deployment and maintenance of real-time, data-intensive solutions.

This simplification of architecture and reduction in complexity increase portability and flexibility without sacrificing performance or scalability. Organizations should not choose a complex path into the cloud, which only makes it more likely that the moving parts will malfunction. A data platform provides the most straightforward means by which businesses can overcome the skills gap in cloud migration and optimization, maximizing public cloud and hybrid deployments for greater agility, resilience and profitability.

Saurav Gupta, Sales Engineer, InterSystems

Saurav Gupta joined InterSystems in July 2006 as a Sales Engineer and has worked across both the technology and healthcare solutions businesses of InterSystems.
