
Delta Run (Partial Run)

Posted: Wed Feb 09, 2011 11:26 pm
by vitumati
Hello Friends.

I have 100,000 (1 lakh) records in my source. I need to design a job so that the first run loads only 50,000 records to the target, and the second run loads the remaining 50,000 records.
Could you please help me with this issue?


Thanks
-----------
Vijay.

Posted: Thu Feb 10, 2011 1:38 am
by kkalyanrao@gmail.com
Hi,

One option would be to create two copies of the job and, using a Transformer stage in each, put a constraint on the rows: the first job fetches the first 50,000 records (@INROWNUM <= 50000) and the second job fetches the rest (@INROWNUM > 50000).
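
As a rough illustration only (this is a minimal Python sketch, not DataStage code; the file name source.txt and the RUN_NUMBER switch are assumptions standing in for the two job copies), the two constraints behave like this:

Code:

# Minimal sketch of the two-run split by row number (illustrative only).
# Assumed: the source is a plain text file "source.txt" with one record
# per line, and RUN_NUMBER selects which half to load.
THRESHOLD = 50_000
RUN_NUMBER = 1  # 1 = first job (rows 1..50000), 2 = second job (rows 50001..)

def select_for_run(row_number, run, threshold=THRESHOLD):
    # Mirrors the Transformer constraints: @INROWNUM <= 50000 vs > 50000
    return row_number <= threshold if run == 1 else row_number > threshold

with open("source.txt") as src, open("target.txt", "a") as tgt:
    for row_number, record in enumerate(src, start=1):
        if select_for_run(row_number, RUN_NUMBER):
            tgt.write(record)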

Posted: Thu Feb 10, 2011 1:57 am
by priyadharsini
The constraint should take the number of nodes the job runs on into account: @INROWNUM is evaluated per partition, so on a multi-node configuration each partition keeps its own row count.
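
To see why the node count matters, here is a purely illustrative Python sketch (the 2-node configuration and round-robin partitioning are assumptions): per-partition row counters restart in each partition, so the cutoff in the constraint should be 50,000 divided by the number of nodes.

Code:

# Illustrative sketch (not DataStage): row counters restart per partition,
# so with N nodes the per-partition cutoff is TOTAL_FOR_RUN / N.
NODES = 2                       # assumed 2-node configuration
TOTAL_FOR_RUN = 50_000          # rows wanted in the first run
PER_PARTITION_CUTOFF = TOTAL_FOR_RUN // NODES

def first_run_rows(total_rows):
    # Simulate round-robin partitioning across NODES partitions.
    partition_row_count = [0] * NODES
    selected = []
    for row in range(1, total_rows + 1):
        p = (row - 1) % NODES             # partition this row lands on
        partition_row_count[p] += 1       # per-partition "@INROWNUM"
        if partition_row_count[p] <= PER_PARTITION_CUTOFF:
            selected.append(row)
    return selected

print(len(first_run_rows(100_000)))       # 50000 rows selected in total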

Posted: Thu Feb 10, 2011 3:54 am
by ray.wurlod
Dare we ask why these runs have to be consecutive?

If you allocate two readers per node to the Sequential File stage (and I'm making a big assumption about the source there) you will have the division that you desire. They will execute concurrently, but load into the target on what are effectively separate connections.
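
For intuition only (this is not how the Sequential File stage is implemented; the file name, row count and the two reader threads below are assumptions), multiple readers split the same source concurrently within a single run, each on its own handle, rather than across separate runs:

Code:

# Illustrative sketch: two "readers" each open their own handle to the
# source and process disjoint halves of the rows concurrently,
# loosely analogous to extra readers per node on a Sequential File stage.
import threading

TOTAL = 100_000    # assumed total row count from the original post
HALF = TOTAL // 2

def reader(path, first_row, row_count, out):
    # Each reader keeps only its assigned range of rows.
    with open(path) as f:
        for i, record in enumerate(f):
            if first_row <= i < first_row + row_count:
                out.append(record)

half_a, half_b = [], []
threads = [
    threading.Thread(target=reader, args=("source.txt", 0, HALF, half_a)),
    threading.Thread(target=reader, args=("source.txt", HALF, HALF, half_b)),
]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(len(half_a), len(half_b))   # both halves loaded in one concurrent run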