
Datastage Parallel Xtender

Posted: Thu May 31, 2007 9:24 pm
by bhags
I am reading a source table and writing to a dataset. From the dataset I am loading the data into the target table. Finally, I need to compare the number of rows read from the source with the number of rows written to the target. If the row counts do not match, I need to abend the job. Is there a specific function to achieve this? Please advise.

Posted: Thu May 31, 2007 9:28 pm
by ArndW
There is no single function that does this, but with PX you won't need to check, since you can use the reject link on the database stage to ensure that no records are dropped.

Posted: Thu May 31, 2007 10:07 pm
by Maveric
Make sure there is no nullable-yes to nullable-no field mapping. Then put a reject link from the Oracle stage to a Transformer, and in the Transformer set the output constraint to abort after 1 row.

Re: Datastage Parallel Xtender

Posted: Tue Jun 05, 2007 8:17 am
by Perwezakh
bhags wrote:I am reading a source table and writing to a dataset. From the dataset I am loading the data into the target table. Finally, I need to compare the number of rows read from the source with the number of rows written to the target. If the row counts do not match, I need to abend the job. Is there a specific function to achieve this? Please advise.
First of all, someone is confusing you in your work. Ask whoever is telling you to do this how you can compare the number of records before they are written to the target table. You can get the source count with the ROWNUM function, but to get the target row count you have to apply reject-file logic and do the count-matching in shell scripts. And you can only do that matching after your DataStage job has completed its run; you won't be able to stop the DataStage job itself based on the output of the shell. Explain this to the designer who is telling you to do this.
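To illustrate the after-run matching Perwezakh describes, here is a minimal shell sketch. In a real setup the two counts would come from the database and from the job's link statistics (e.g. via the `dsjob -linkinfo` command on the DataStage server, which is assumed here and not shown running); the `check_counts` function name and the hard-coded counts are purely illustrative.

```shell
#!/bin/sh
# Sketch of an after-job row-count check, run once the DataStage job
# has finished. A non-zero exit status lets a calling sequence or
# scheduler treat the run as failed.

check_counts() {
    src=$1
    tgt=$2
    if [ "$src" -ne "$tgt" ]; then
        echo "Row count mismatch: source=$src target=$tgt" >&2
        return 1    # signal failure to the caller
    fi
    echo "Row counts match ($src rows)"
    return 0
}

# Illustrative values; in practice these would be captured from
# the source query and the target load statistics.
check_counts 100 100
```

Wired into a wrapper script, the non-zero return code is what lets you abort downstream steps, since (as noted above) the job itself has already completed by the time the comparison runs.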

Re: Datastage Parallel Xtender

Posted: Tue Jun 05, 2007 9:09 am
by VCInDSX
bhags wrote:...If the row counts do not match, I need to abend the job. Is there a specific function to achieve this? Please advise.
I think Perwezakh has a good point....
Try to gather all details about the requirements...

When you say "abend", do you mean just "end" the job... or maybe abort a sequence that might have other jobs in it that are dependent on the target table having all the data for further processing?

Will the target table already have records from previous loads...?

What about DB cleanup? If incomplete data has been loaded, would that be rolled back?

If your design has a temporary staging area where you load first and perform your gate checks, that might help... but it will add an extra step to your final loading process: copying from the staging area to the target table.

Cheers!!!