
Limit the number of records

Posted: Thu Jan 04, 2007 10:42 pm
by vij
Hi,

As I wanted to limit the number of records loaded by a job, I used this option: Job Properties -> Execution -> Force Compile.

Here I faced a problem: all of the limited records read by DataStage are being written to the log, so the job takes more time and occupies more server space too.

Can anyone please let me know if there is any way to avoid this, or suggest another solution to limit the records loaded by the job?

Thanks,

Re: Limit the number of records

Posted: Thu Jan 04, 2007 10:53 pm
by ajith
It's not clear what your exact requirement is.
Is it to load fewer rows to the target?
In that case a Head or Tail stage would do.
I don't understand what you did with Force Compile, as it never gives any option to limit the rows.

Sorry that I'm not following your requirement; please clarify.

Posted: Thu Jan 04, 2007 10:54 pm
by kumar_s
Force Compile to reduce records :roll:
Are you getting confused with Dump Score?
What is your job design?
When running the job, you have limit options like 'Stop stages after n rows' that you can use.

Posted: Thu Jan 04, 2007 11:45 pm
by ray.wurlod
There are lots of ways to do this. What does your job design look like? The best way is not to extract rows in the first place. Another possibility is a Head or Tail or Sample stage. What precisely are your requirements?
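(Not part of the original thread, and not DataStage code.) The idea behind "don't extract rows in the first place" and the Head stage is simply to stop pulling from the source once n rows have passed through, rather than reading everything and discarding the rest downstream. A minimal sketch of that behaviour in plain Python, using a lazy generator as a stand-in for the source stage:

```python
from itertools import islice

def head(rows, n):
    """Pass through only the first n rows, like a Head stage.

    Because islice works lazily, the source is never asked
    for more than n records - the rest are never extracted.
    """
    return list(islice(rows, n))

# Hypothetical source producing a large stream of records lazily
source = ({"id": i} for i in range(1_000_000))

sample = head(source, 10)
print(len(sample))  # only 10 records ever left the "source"
```

The key design point is laziness: a Head stage (or a row limit pushed into the source query) short-circuits extraction, whereas filtering after a full read still pays the cost of reading, and of logging, every record.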

Posted: Fri Jan 05, 2007 6:44 am
by trobinson
I would ask why the limited records are being written to the log. Could it be that there is a Constraint on a link, or an Otherwise link? Perhaps a Peek stage is capturing the rejected records in the Director log? If it isn't needed, remove it and let the rejected records fall on the floor. Better yet, write them to a Sequential File.