Complex OpenUpgrade Migration

Day 5/6: Quants

It's Saturday/Sunday, so not too much work on this today as I have other work to catch up on. We are still stuck on quants, and this is a hard area. Currently it is reprocessing the done moves. In 22 hours it has managed somewhere between 50,000 and 80,000; it's hard to tell, as the entire module upgrade happens inside a single SQL transaction, so I'm really just monitoring ids in pg_activity. A check shows Postgres really isn't even caring about the workload, toddling along at 10-25 tps and 12% CPU. Being charitable, we are processing a little under 4,000 per hour, so between 36 and 48 days to finish. That isn't a realistic proposition.
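For reference, the manual polling amounts to watching the age of the upgrade transaction and whatever query it is currently running, which is essentially all pg_activity shows. A minimal sketch of the same thing in Python, assuming psycopg2, a local Postgres, and placeholder connection details and database name:

```python
# Rough progress check: sample the active backends on the migration
# database and print how long their transaction has been open and what
# they are doing. Connection details and dbname are placeholders.
import time
import psycopg2

conn = psycopg2.connect(dbname="postgres", user="odoo", host="localhost")
conn.autocommit = True

with conn.cursor() as cur:
    while True:
        cur.execute("""
            SELECT pid,
                   now() - xact_start AS xact_age,
                   state,
                   left(query, 120)
            FROM pg_stat_activity
            WHERE datname = %s
              AND state <> 'idle'
            ORDER BY xact_start
        """, ("odoo_migration",))
        for pid, age, state, query in cur.fetchall():
            print(pid, age, state, query)
        # ~80,000 moves in 22 hours is under 4,000/hour,
        # hence the 36-48 day estimate for the full table.
        time.sleep(60)
```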

After a bit of googling it appears there is a quadratic relationship of sorts in the way the ORM populates these fields, so 48 days was probably very, very charitable. Complaints about the slowness, with runs taking hours and days, start at volumes around 3% of what we are trying to do. I decided to try chunking the migration and it did improve things, but only down to an estimated 21 days. I then attempted to do it via SQL, which is currently 19 hours in with no idea how far it has to go. I'll give it 36 hours, but it is time to consider alternative approaches.
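The chunking attempt looked roughly like the sketch below: pull the ids of the done moves once with plain SQL, then feed them to the ORM in fixed-size batches so each batch stays small. This is only an illustration of the idea, assuming an Odoo environment `env` is in scope; `reprocess_moves()` is a hypothetical stand-in for whatever the migration actually does per move, and the cache-clearing call differs between Odoo versions.

```python
# Chunked reprocessing sketch: batch the done moves so the ORM never
# works on the whole set in one go. `reprocess_moves` is a hypothetical
# placeholder for the real per-move migration work.
CHUNK_SIZE = 2000

def reprocess_moves(moves):
    # placeholder: whatever recomputation / quant rebuild the migration
    # performs on this batch of stock.move records
    pass

def migrate_done_moves(env):
    env.cr.execute(
        "SELECT id FROM stock_move WHERE state = 'done' ORDER BY id"
    )
    ids = [row[0] for row in env.cr.fetchall()]
    for start in range(0, len(ids), CHUNK_SIZE):
        chunk = ids[start:start + CHUNK_SIZE]
        reprocess_moves(env['stock.move'].browse(chunk))
        env.clear()  # drop ORM caches between chunks (exact API varies by version)
```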

The OpenUpgrade strategy could be described as an in-place ORM migration: take the existing data, snapshot it, run it through processes that replace it with newly migrated data, start the server up, and users are away on the new system. In the past, when I've done this primarily via SQL, the process was much the same: define the current state and the desired end state, and develop some tools to process one into the other. However, it ended up in a new database, which means that rather than deleting a record that is no longer required, you simply don't create it. It also allowed more complex scenarios, like taking separate databases and migrating them into one, or even separate parts of different databases. Fundamentally, though, it was a snapshot of an existing database at a point in time, although it had some support for incremental loading.
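A simplified picture of that older approach: read from the source database, transform in code, and insert only the records you actually want into the fresh target. The sketch below uses psycopg2 with illustrative table and column names rather than the real schema; the real tooling also handled mapping tables, foreign keys and incremental loads.

```python
# Old-style migration sketch: snapshot rows from the source database,
# transform them, and create only the wanted rows in a fresh target.
# Table and column names are illustrative, not the real Odoo schema.
import psycopg2

src = psycopg2.connect(dbname="erp_old", user="odoo", host="localhost")
dst = psycopg2.connect(dbname="erp_new", user="odoo", host="localhost")

with src.cursor(name="moves") as read, dst.cursor() as write:
    read.itersize = 5000  # server-side cursor: stream rather than load everything
    read.execute("SELECT id, product_id, qty, state FROM stock_move")
    for move_id, product_id, qty, state in read:
        if state == 'cancel':
            continue  # unwanted records are simply never created in the target
        write.execute(
            "INSERT INTO stock_move (id, product_id, qty, state) "
            "VALUES (%s, %s, %s, %s)",
            (move_id, product_id, qty, state),
        )
dst.commit()
```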

The other option I had used in the past, for a relatively straightforward but large migration, was the Pentaho Data Integration ETL tool. Using this would make it possible to create a streaming migration that continually updates as new records are entered, greatly extending the migration window. The final option is a fresh start, and then there is everything in between.
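The appeal of the streaming approach is the incremental loop: track a watermark on the source and keep pulling anything written after it, so the final cut-over shrinks to the most recent changes only. A minimal sketch of that loop in Python rather than Pentaho, assuming the source tables carry a `write_date` column (as Odoo tables do) and a hypothetical `upsert_row()` doing the transform and load into the target:

```python
# Streaming-style incremental sync sketch: repeatedly pull source rows
# modified since the last watermark and upsert them into the target.
# `upsert_row` is a hypothetical transform/load step.
import time

def upsert_row(dst_cur, row_id):
    # placeholder: transform the source row and insert/update it in the
    # target schema (the work the ETL steps would otherwise be doing)
    pass

def sync_forever(src_conn, dst_conn, table="stock_move"):
    watermark = None  # last write_date seen on the source
    while True:
        with src_conn.cursor() as src_cur, dst_conn.cursor() as dst_cur:
            if watermark is None:
                src_cur.execute(
                    f"SELECT id, write_date FROM {table} ORDER BY write_date")
            else:
                src_cur.execute(
                    f"SELECT id, write_date FROM {table} "
                    "WHERE write_date > %s ORDER BY write_date",
                    (watermark,))
            for row_id, write_date in src_cur:
                upsert_row(dst_cur, row_id)
                watermark = write_date
            dst_conn.commit()
        time.sleep(30)  # poll again; only recent changes remain to process
```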

The issue with all these options is that the code migration needs to be completed first in order to define the desired end state.
