As cloud services become more popular, we’re seeing a growing number of businesses come to us looking for a way to move their old on-premises systems to the cloud.
As data migration becomes a more common requirement for our customers, we’ve decided to share our way of handling this activity. Our approach is repeatable and ensures no unexpected data loss, leading to minimal interruption and reduced business risk for our customers.
Why is data migration becoming increasingly important?
We’re seeing more and more customers moving past the old on-premises approach to apps and data. Many of them are making the leap to cloud-based services, and that usually requires at least some data migration.
Take a common scenario, for example: many businesses have used (or perhaps endured!) on-premises CRMs such as Siebel for many years but have finally decided to move to a cloud system, with Salesforce and Dynamics 365 being the most popular choices. Such a move relies on getting the on-premises data into the new cloud system.
To succeed with such data migrations, we take the following approach:
- We create a mirror (or take a data dump) of the customer’s existing data so we can work with it. This means we don’t need to depend on the customer’s live data.
- We create a mirror of the destination system.
- With the two mirrors in place, we analyse the data and develop scripts to extract, transform and clean the data that will move between them. This script development typically takes a couple of weeks.
- We run our scripts in a loop several times so we can spot and correct any errors (a simplified sketch of this loop follows the list). This sometimes reveals where compromises need to be made to ensure a successful data migration. The key point is that there’s no chance of unexpected data loss for our customers: any compromise is a planned one.
- Once we’re happy that our scripts successfully complete their loops, a further script loads the data into the new system. At this stage, we can be totally confident that what has been tested between the two mirror systems will be identically replicated for the live data migration. Before pressing the button, we refresh the mirror with the latest live data.
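To make that loop a little more concrete, here is a minimal sketch in Python. The databases, table names, fields and clean-up rules are invented purely for illustration; the real scripts are built against each customer’s actual schemas.

```python
import sqlite3

# Hypothetical mirror databases; in a real engagement these are full copies
# of the customer's source data and of the destination cloud schema.
source = sqlite3.connect(":memory:")
destination = sqlite3.connect(":memory:")

# -- stand-in data so the sketch runs on its own --------------------------
source.execute("CREATE TABLE contacts (id INTEGER, name TEXT, email TEXT)")
source.executemany("INSERT INTO contacts VALUES (?, ?, ?)", [
    (1, " alice smith ", "ALICE@example.com"),
    (2, "bob jones", "not-an-email"),
])
destination.execute(
    "CREATE TABLE crm_contacts (legacy_id INTEGER, full_name TEXT, email TEXT)"
)
# --------------------------------------------------------------------------

def extract(conn):
    """Pull the records to migrate from the source mirror."""
    return conn.execute("SELECT id, name, email FROM contacts").fetchall()

def transform(rows):
    """Clean and reshape records; anything we can't fix is reported, never silently dropped."""
    cleaned, errors = [], []
    for row_id, name, email in rows:
        if "@" not in (email or ""):
            errors.append((row_id, "invalid email"))
            continue
        cleaned.append((row_id, name.strip().title(), email.lower()))
    return cleaned, errors

def load(conn, rows):
    """Write cleaned records into the destination mirror."""
    conn.executemany(
        "INSERT INTO crm_contacts (legacy_id, full_name, email) VALUES (?, ?, ?)", rows
    )
    conn.commit()

# The loop: run, review the error report, adjust the rules, run again
# until every remaining "error" is a decision the customer has signed off on.
cleaned, errors = transform(extract(source))
load(destination, cleaned)
print(f"Migrated {len(cleaned)} records; {len(errors)} need a decision before go-live.")
```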
The mirroring approach means that customers’ live systems tend to be out of action for 1–2 hours at most, and this downtime can sometimes be scheduled for when it will cause the least disruption to the business.

How we improve our data migration process
As each data migration completes successfully, we keep track of the systems we’ve migrated from and to. This means we already know what to do when similar migrations come up for other customers.
If a new customer comes along with the same destination system, for example, the migration is straightforward: the destination side of the process is already proven, so only the customer’s source mirror data needs adjusting.
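As a hypothetical illustration of that reuse, the destination-side field mapping below would already sit in our catalogue, so only the customer-specific extraction would need rewriting. All the field and record names here are invented.

```python
# Catalogued once per destination system: how our cleaned fields map onto
# the destination's field names (names invented for illustration).
DESTINATION_MAPPING = {
    "full_name": "Name",
    "email": "Email",
    "account": "AccountId",
}

# Only this part changes for each new customer's source mirror.
def extract_customer_records():
    """Stand-in for the customer-specific extraction and clean-up step."""
    return [
        {"full_name": "Alice Smith", "email": "alice@example.com", "account": "A-001"},
    ]

def to_destination_shape(record):
    """Rename source fields to the destination's catalogued field names."""
    return {dest: record[src] for src, dest in DESTINATION_MAPPING.items()}

for record in extract_customer_records():
    print(to_destination_shape(record))
```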
Everything in the above process is scientific because it is all testable, reliable and repeatable. There’s no “it should probably work” guesswork involved here: this is a robust, professional approach to ensuring successful data migrations. Those migrations become ever smoother as we catalogue the data migrations of more and more customers.
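One hypothetical example of what “testable” means in practice: after each scripted run, a simple reconciliation check compares the two mirrors. The table names below are invented, and in a real migration the expected count would allow for any planned, signed-off exclusions.

```python
import sqlite3

def record_counts_match(source_db, destination_db):
    """Return True when the destination mirror holds the expected number of migrated records."""
    with sqlite3.connect(source_db) as src, sqlite3.connect(destination_db) as dst:
        source_count = src.execute("SELECT COUNT(*) FROM contacts").fetchone()[0]
        migrated_count = dst.execute("SELECT COUNT(*) FROM crm_contacts").fetchone()[0]
    # In practice the comparison allows for planned, agreed exclusions.
    return source_count == migrated_count
```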
And while we’ve mentioned a typical data migration case of a move between CRMs, it’s important to remember that our approach is set up to support a move of any on-premises system to the cloud.
Sometimes, we make minor tweaks to optimise for a customer’s needs. For example, in PropTech, we’re starting to see more migrations from old housing systems such as Orchard and Northgate to new ones such as Rubixx. The trend towards SaaS there means our approach has to alter slightly: instead of going database to database, we load the data through the destination system’s APIs, so it’s important to use the right tooling.
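As a rough sketch of what API-based loading looks like, the Python below pushes transformed records to a placeholder endpoint in small batches. The URL, authentication and payload shape are invented, since each SaaS product defines its own API.

```python
import requests

API_URL = "https://example-housing-saas.invalid/api/tenancies"  # placeholder endpoint
API_TOKEN = "replace-me"                                        # placeholder credential

def load_batch(records, batch_size=100):
    """Push transformed records to the destination API in small batches."""
    headers = {"Authorization": f"Bearer {API_TOKEN}"}
    for start in range(0, len(records), batch_size):
        batch = records[start:start + batch_size]
        response = requests.post(API_URL, json=batch, headers=headers, timeout=30)
        response.raise_for_status()  # surface failures so they can be reviewed, not ignored

# Example call (records would come from the same transform step used in the mirror approach):
# load_batch([{"tenancy_ref": "T-0001", "start_date": "2024-04-01"}])
```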
But whether we’re dealing with CRMs, housing systems or any other on-premises system that needs to move to the cloud, we’re confident of moving our customers’ data reliably with no unexpected data loss.