Archive job design suggestion

Posted: Sun May 10, 2009 9:56 pm
by ag_ram
hi,

I have a series of server jobs
DRS stage ------ Transformer---- Sequential file

There are 33 similar jobs like the above, each reading a different table.

These jobs are called from a set of sequencer jobs.

The requirement is to archive old data. Each table holds approximately a million rows to archive. An after-job subroutine (ExecDOS, configured in each job's properties) then creates a zip file of the extracted data in a specified directory.
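For illustration, the input value given to the ExecDOS after-job subroutine might look like the fragment below. This is a sketch only: the paths, the table name, and the #ArchiveDate# job parameter are invented for the example, and it assumes an Info-ZIP-compatible zip.exe is on the PATH of the DataStage server.

```shell
rem Illustrative ExecDOS input value (names and paths invented).
rem -j stores the file without its directory path inside the archive.
zip -j D:\archive\CUSTOMER_#ArchiveDate#.zip D:\extract\CUSTOMER.txt
```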

My concern with this design is:

a. Is scheduling more than 15 jobs on a single server canvas feasible, and what are the implications?

b. What are the limitations of handling millions of records and moving files between directories in DataStage, in terms of log issues?

Any suggestion is welcome; I am not content with this design.

thanks,

Posted: Sun May 10, 2009 10:31 pm
by ray.wurlod
(a) It's feasible PROVIDED THAT your hardware supports that many processes. The total load is equivalent to running fifteen simple jobs (one equivalent to each stream) simultaneously. I would prefer individual jobs and either a sequence or a job control routine in which I could set an upper limit on the number processing simultaneously.

(b) There are no log issues with very many rows; the stage statistics simply have larger numbers. Moving between directories should not impact the log at all, unless you explicitly log the results of each cd command.