
Large xml file

Posted: Thu Sep 09, 2010 3:27 pm
by satheesh_color
Hi All,

We have created a huge XML file (8 million records) using the XMLOutput stage. The job aborts in the middle of the XML extraction.

Kindly let me know how to split it into two or more XML files.

Raj.

Posted: Thu Sep 09, 2010 6:39 pm
by tcj
Is the job aborting when you are trying to read the XML file or while you are creating it?

Tim

Posted: Thu Sep 09, 2010 7:15 pm
by chulett
Assuming the problem is creating it, look into the use of the Trigger Column option; value changes there will trigger a switch to a new output filename.

Posted: Fri Sep 10, 2010 9:55 am
by arunkumarmm
chulett wrote:Assuming the problem is creating it, look into the use of the Trigger Column option; value changes there will trigger a switch to a new output filename. ...
He said there are 8 million records. By enabling this, won't it create that many files?

Posted: Fri Sep 10, 2010 9:59 am
by eostic
It depends entirely on the logic of the job. The "trigger" is just a column on the input link; what you set it to is up to you. For example, a counter that yields a column value of "1" for the first 500,000 rows, "2" for the next 500,000, and so on would be one way to use it.

Ernie

Posted: Fri Sep 10, 2010 10:21 am
by chulett
Exactly - you control when the value changes and it doesn't matter what it is. We typically used a Mod() function to increment a counter every X output records.
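The Mod()-style counter described above would be implemented as a stage variable in a DataStage Transformer, but the batching logic itself is easy to illustrate outside DataStage. A minimal Python sketch, assuming a hypothetical batch size of 500,000 rows (the function name and batch size are illustrative, not part of any DataStage API):

```python
def trigger_value(row_number: int, batch_size: int = 500_000) -> int:
    """Return the trigger-column value for a 1-based row number.

    The value stays constant for batch_size consecutive rows, then
    increments by one, so a stage watching this column for changes
    would switch to a new output file at each batch boundary.
    """
    return (row_number - 1) // batch_size + 1

# The first 500,000 rows map to file 1, the next 500,000 to file 2,
# and 8 million rows would yield 16 output files in total.
assert trigger_value(1) == 1
assert trigger_value(500_000) == 1
assert trigger_value(500_001) == 2
assert trigger_value(8_000_000) == 16
```

With 8 million records this produces 16 distinct trigger values, i.e. 16 output files rather than one file per record, which addresses the concern raised above.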