Archive job design suggestion
Posted: Sun May 10, 2009 9:56 pm
hi,
I have a series of server jobs
DRS stage ------ Transformer---- Sequential file
There will be 33 similar jobs like the one above, each reading a different table.
These jobs are called from a set of sequencer jobs.
The requirement is to archive old data.
Each table holds roughly a million rows to archive. The after-job subroutine ExecDos, set in each job's job properties, is then used to create a zip file of the data in a specified directory.
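To make the archiving step concrete, here is a rough sketch of what the after-job command (ExecDos on Windows, ExecSH on Unix) might run. The paths, the CUSTOMER table name, and the date suffix are illustrative assumptions, not the actual setup; this uses gzip for portability where the post mentions a zip file.

```shell
# Hypothetical after-job command: compress the extract written by the
# Sequential File stage into a dated archive, then remove the original.
# Directories and the CUSTOMER file name are placeholders.
mkdir -p /tmp/ds_extract /tmp/ds_archive
printf '1,old row\n' > /tmp/ds_extract/CUSTOMER.txt   # stand-in for the job's output file
gzip -c /tmp/ds_extract/CUSTOMER.txt > /tmp/ds_archive/CUSTOMER_20090510.txt.gz
rm /tmp/ds_extract/CUSTOMER.txt
```

In practice the date suffix and paths would be built from job parameters so each of the 33 jobs archives its own table's extract.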
My concern with this design is:
a. Is scheduling more than 15 jobs on a single server sequencer canvas feasible, and what are the implications?
b. What are the limitations of handling millions of records and moving files between directories with DataStage, in terms of log growth?
Any suggestion is welcome; I am not content with this design.
thanks,