
Maximum size of input text file

Posted: Tue Sep 12, 2006 11:25 pm
by evanmaas
Hi,

I have a DS job that reads an input text file of 4 GB. It has been running for 15 hours and does nothing.

Can someone tell me if there is a limit on the maximum size of an input text file that DataStage can read?

Regards,

Erik

Posted: Tue Sep 12, 2006 11:35 pm
by sun786
How many records are there in the file?
Try to view the data for the first 100 records and check for any errors.
Did you try splitting the file into 1 GB pieces (four files) and then reading/loading those?
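A minimal sketch of one way to do that split outside DataStage, assuming a plain newline-delimited text file (the path and chunk size below are just placeholders; the Unix split utility would do the same job):

Code: Select all

# split_big_file.py -- split a large text file into roughly 1 GB pieces on
# line boundaries, so each piece can be tested in a separate DataStage run.

CHUNK_BYTES = 1 * 1024 ** 3          # target size per piece (~1 GB)
SOURCE = "input.txt"                 # placeholder path to the 4 GB source file

def split_file(source, chunk_bytes):
    part, written, out = 0, 0, None
    with open(source, "rb") as src:
        for line in src:             # read record by record so pieces end on full lines
            if out is None or written >= chunk_bytes:
                if out:
                    out.close()
                part += 1
                out = open("%s.part%02d" % (source, part), "wb")
                written = 0
            out.write(line)
            written += len(line)
    if out:
        out.close()
    return part

if __name__ == "__main__":
    print("wrote %d pieces" % split_file(SOURCE, CHUNK_BYTES))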

I can load a 3 GB file with 13 million records without any issue.

Regards
Siraz

Posted: Wed Sep 13, 2006 12:31 am
by m_keerthi2005
I have loaded a 5 GB file without any issues; it completed within four hours. When you click View Data, the rows should come back within seconds. Please check the options you have selected. It will work.

Posted: Wed Sep 13, 2006 1:19 am
by ray.wurlod
There is no limit on the size of a Sequential File that DataStage can read.

From the other posters' experiences I would suggest that the delay is not in reading the text file, but elsewhere in your job design.

You can prove this easily enough. Construct a job as follows.

Code:

Sequential File  ----->  Transformer  ----->  Sequential File

Put @FALSE as the constraint on the Transformer stage output link and/or append the output to /dev/null. This will show you how fast DataStage can read the large file.

Everything you add from there will increase the total elapsed time.
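If you also want a baseline completely outside DataStage, a rough sketch like the following (the path is a placeholder) simply reads the file once and discards every record, much like the /dev/null test above, so you can see what a raw sequential read costs on your hardware:

Code:

# time_raw_read.py -- time a plain sequential read of the large file,
# keeping nothing, to get a lower bound for the DataStage job's read phase.

import time

SOURCE = "input.txt"                 # placeholder path to the 4 GB file

start = time.time()
records = 0
total_bytes = 0
with open(SOURCE, "rb") as src:
    for line in src:                 # read record by record, discard the data
        records += 1
        total_bytes += len(line)
elapsed = time.time() - start

print("read %d records (%.1f MB) in %.1f seconds"
      % (records, total_bytes / 1048576.0, elapsed))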

Posted: Wed Sep 13, 2006 2:15 am
by Prashantoncyber
Please check whether there are any warnings in the log.

In my experience, warnings have an impact on job performance.

Also, what file system are you using?

Posted: Wed Sep 13, 2006 7:15 am
by kcbland
Please describe the job design, stages, etc. What if it's not stuck reading the file, but writing to the target? Is there a before/after stage or job routine call that's the true problem? Viewing the data in the Sequential File stage only proves that the definitions are valid.