
22 Million Records from File to Oracle Enterprise Stage

Posted: Fri Nov 14, 2008 3:37 am
by skp
Hi all,

I want to load 22 million records from a sequential file into an Oracle table.
I designed the job as

SequentialFile -----> OracleTable

But when I run the job, the Sequential File stage is not able to read this huge file, which is more than 4 GB.

Please let me know whether there is any alternative for this scenario.

Regards
Sudha

Posted: Fri Nov 14, 2008 3:41 am
by mdbatra
1. Divide the input file into two parts.
2. Try reading both parts in a single Sequential File stage; if that works, good.
Otherwise, use two Sequential File stages to read the data and then funnel them together.

Posted: Fri Nov 14, 2008 4:56 am
by Romy
Hi,

You have mentioned that you are not able to read the file using the Sequential File stage itself. Better to divide the file into smaller pieces and try running the job...

Posted: Fri Nov 14, 2008 5:17 am
by skp
Thanks a lot for your response.

1) We have been told that the record count will grow by 10% every year on top of the 22 million, cumulatively.
2) If I want to divide the file into smaller files, how would I do it in DataStage?

The other thought is:
a) Divide the file into small files using a UNIX script.
b) Then, using a Folder stage, pass all the files into the DataStage job.

Can anyone help me with this approach?
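
For illustration, a rough sketch of thought (a) is below: split the file by line count, then call the load job once per piece with the dsjob command. The project name, job name, parameter name and paths are only placeholders, and it assumes the job reads its source file from a parameter.

#!/bin/ksh
# Placeholder paths and names -- adjust to your environment.
SRC=/data/in/big_input.dat

# Split the 22-million-row file into chunks of ~2 million lines each;
# this produces part_aa, part_ab, ... in the current directory.
split -l 2000000 "$SRC" part_

# Run the DataStage load job once per chunk, passing the chunk name
# through a job parameter (here assumed to be called SrcFile).
for f in part_*
do
    dsjob -run -wait -param SrcFile="$PWD/$f" MYPROJ LoadOracleJob
done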

Posted: Fri Nov 14, 2008 5:33 am
by mdbatra
What's the frequency of the input feed? Also, I doubt there is a Folder stage available in parallel jobs.

Posted: Fri Nov 14, 2008 6:02 am
by rajngt
Based on your design it seems that you are doing only inserts; in that case, go for the Oracle Load option instead of inserting through DataStage.
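
If you do the load outside DataStage, a bare-bones SQL*Loader sketch would look something like the script below; the table, columns, delimiter, paths and connect string are only examples, not taken from your job.

#!/bin/ksh
# Example only -- replace table, columns, delimiter and credentials.
cat > load_target.ctl <<'EOF'
LOAD DATA
INFILE '/data/in/big_input.dat'
APPEND
INTO TABLE target_table
FIELDS TERMINATED BY ','
(col1, col2, col3)
EOF

# direct=true uses the direct path load, which is far faster than
# conventional-path inserts for this kind of volume.
sqlldr userid=scott/tiger@ORCL control=load_target.ctl log=load_target.log direct=true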

Posted: Fri Nov 14, 2008 8:21 am
by swapnilverma
Use a File Set stage instead of the Sequential File stage.

Hope that helps.