Incremental Migration

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

vmcburney
Participant
Posts: 3593
Joined: Thu Jan 23, 2003 5:25 pm
Location: Australia, Melbourne

Post by vmcburney »

Updates can be very difficult to find unless you can make a change to your legacy database to tag them. For example, add a modified-date column to each table on the legacy database, along with a trigger that populates the field whenever a row is inserted or updated. Have a delete trigger write the primary key of any removed row to a delete table.
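A minimal sketch of that tagging approach in Oracle-style SQL, assuming an illustrative CUSTOMER table keyed on CUST_ID (the table, column, and trigger names are hypothetical, not from any particular system):

    ALTER TABLE customer ADD (modified_date DATE);

    CREATE TABLE customer_deletes (
        cust_id      NUMBER,
        deleted_date DATE
    );

    -- Stamp every insert or update with the current date/time.
    CREATE OR REPLACE TRIGGER customer_mod_trg
    BEFORE INSERT OR UPDATE ON customer
    FOR EACH ROW
    BEGIN
        :NEW.modified_date := SYSDATE;
    END;
    /

    -- Record the primary key of every deleted row.
    CREATE OR REPLACE TRIGGER customer_del_trg
    AFTER DELETE ON customer
    FOR EACH ROW
    BEGIN
        INSERT INTO customer_deletes (cust_id, deleted_date)
        VALUES (:OLD.cust_id, SYSDATE);
    END;
    /

The extract job can then pick up rows where MODIFIED_DATE is later than the last run, plus the contents of CUSTOMER_DELETES, to build the incremental recordset.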

If you are unable to make any changes to your legacy systems, investigate the possibility of doing full or subset reloads periodically instead of trying to find incremental recordsets. A helpful idea is to add a DBSource field to each record indicating which legacy database the row came from, so those rows can be deleted and replaced. You are defining which system owns that data so that system can reload it with the latest values.
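A rough sketch of that delete-and-replace pattern, again in Oracle-style SQL with hypothetical names (TARGET_CUSTOMER, DBSOURCE, 'LEGACY_A'):

    ALTER TABLE target_customer ADD (dbsource VARCHAR2(20));

    -- Before reloading the extract owned by legacy system A,
    -- remove only the rows that system owns...
    DELETE FROM target_customer WHERE dbsource = 'LEGACY_A';

    -- ...then the DataStage load job re-inserts the latest full or
    -- subset extract with DBSOURCE set to 'LEGACY_A' on every row.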

You may find that with a multiprocessor machine and the Parallel Extender you can reload your data within an overnight processing window. You can also write a single set of extracts that performs both the original migration and the ongoing updates, which saves a lot of development time.