Moving to the cloud is a great opportunity for application modernization. Database applications, in particular, stand out: not only are they among the most critical applications, they are also the hardest to modernize. When migrating to a cloud database, dependent applications typically require significant modifications, often so fundamental that a complete rewrite appears to be the only solution.
IT leaders find themselves in a challenging position. Adopting a cloud-native database is clearly superior to moving a legacy database into the cloud. However, rewriting large numbers of applications as part of the migration is a daunting task. It will disrupt the business and ensnare IT in a costly multi-year migration project. Add to that the low rate of success of database migrations and it becomes clear why IT is hesitant to commit.
What are IT leaders to make of this situation? In the end, the lion's share of a migration project is not about modernization at all. It is about making existing applications work with a new database. This is about to change. Technology advances in the form of database system virtualization offer a solution that optimizes for business objectives: adopt new database technology rapidly without sacrificing long-standing investments in application development.
This won’t be the last database
IT leaders often approach a database migration as if it were the last migration they will ever undertake. Much of that is wishful thinking. Because migrations are so costly and time-consuming, leaders treat them as a once-in-a-tenure decision. This puts extra pressure on the choice: the room for mistakes narrows quickly when you have only one shot at the problem.
However, even within the ecosystem of a single cloud, chances are the business will need to move between data systems. Some workloads go to the cloud data warehouse, others to operational systems. Over time they may move back and forth.
In addition, any database technology that emerges tomorrow may become a strong competitive advantage, if only enterprises can adopt it quickly. Today, every major enterprise operates a wide variety of databases, each reflecting the state of the art at the time of its introduction. Every one still operating despite being outdated is a reminder that part of the enterprise is stuck in the past.
Being agile in adopting database systems is more critical than ever.
If it isn’t broken, don’t fix it
Ask any IT leader how the last database migration went, and you will get similar responses: migrations are viewed as a necessary evil. Deep down, leaders keep wondering whether the money spent on a migration would have been better deployed toward generating new revenue. Over 90% of applications are typically rewritten just to make them work with the new database. No new functionality is actually created.
Moreover, applications are not rebuilt with the same functionality. Frequently, the applications are compromised in the process to avoid delays.
Today’s conventional migration approaches perpetuate this pattern. Oftentimes, a complete rewrite is prescribed as a solution to phase out legacy databases. Ironically, the cost of migration has come to be viewed as the tax of adopting new technology.
What if instead of migrating, you could preserve most applications and focus on the few that could truly benefit from modernization? The good news is database system virtualization enables just that.
Database system virtualization unlocks the best of both worlds
CIOs have to navigate a challenging situation. Any number of external factors may demand swift action: an upcoming deadline for vacating the data center, requirements for high availability, or a legacy vendor that charges what feel like extortionate fees. Even those who have moved to a third-party database in the cloud often look to go cloud-native.
Database system virtualization offers a powerful alternative. By extending the principles of virtualization to the database stack, database system virtualization decouples applications and databases. With database system virtualization, applications written for a legacy system run seamlessly on a cloud-native database system. This includes cloud data warehouses but also OLTP databases.
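To make the decoupling idea concrete, here is a minimal, purely illustrative sketch of how a virtualization layer might work: the application keeps emitting SQL in its original legacy dialect, and a translation layer rewrites each statement into the cloud-native target's dialect before execution. The function name and the rewrite rules below are hypothetical, not any vendor's actual API; real products perform far deeper translation than pattern substitution.

```python
# Illustrative sketch of the decoupling behind database system
# virtualization: applications speak their original (legacy) SQL
# dialect, and a translation layer rewrites each statement for the
# cloud-native target. All names and rules here are hypothetical.

import re

# Tiny example dialect-translation rules (legacy -> cloud-native).
REWRITE_RULES = [
    # A legacy-style row-limiting clause becomes a standard LIMIT.
    (re.compile(r"^SELECT\s+TOP\s+(\d+)\s+(.*)$", re.IGNORECASE | re.DOTALL),
     r"SELECT \2 LIMIT \1"),
    # A proprietary date keyword mapped to the target's equivalent.
    (re.compile(r"\bCURRENT_DATE_LEGACY\b"), "CURRENT_DATE"),
]

def virtualize(legacy_sql: str) -> str:
    """Rewrite a statement written for the legacy system so that,
    from the application's point of view, it runs unchanged on the
    new database."""
    sql = legacy_sql.strip()
    for pattern, replacement in REWRITE_RULES:
        sql = pattern.sub(replacement, sql)
    return sql

# The application still emits legacy SQL...
print(virtualize("SELECT TOP 10 name FROM customers"))
# ...while the cloud database receives its own dialect.
```

The point of the sketch is the separation of concerns: the application never changes, and all dialect knowledge lives in the intermediary layer, which is what lets old and new applications target the same cloud database.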
With database system virtualization, enterprises unlock the best of both worlds. (1) Long-standing investments in business-critical applications are preserved: applications are replatformed to the new destination rapidly, without a rewrite and without business disruption. (2) New applications can be developed immediately on the new cloud database platform.
IT leaders have a full plate. Generating new revenue and outperforming the competition is not accomplished by rewriting perfectly good applications. Instead, adopting a new database platform such as a cloud database can be the perfect launch pad for new initiatives.
By leveraging database system virtualization, old and new applications can exist side-by-side independent of which database they were originally developed for. Shared data and shared infrastructure bring about critical efficiencies. Best of all, the massive cost savings of avoiding a manual migration can be deployed directly to building new applications and generating new revenue.
Database system virtualization may still be a new technology, but it is already giving leading enterprises an edge. It is backed by some of the biggest names in data technology and all major cloud providers. IT leaders will be pleased to know that draconian measures like manual or tool-aided database migrations will soon be a thing of the past.
About the author: Mike Waas is the founder and CEO of Datometry, a SaaS database virtualization platform enabling existing applications to run natively on modern cloud data management systems without being rewritten. He has held senior engineering positions at Microsoft, Amazon, EMC, and Pivotal and is the architect of Greenplum’s ORCA query optimizer. He has 40+ scientific peer-reviewed publications and holds 20+ patents.