Hi,
When I write the data to a Sequential File stage, the job runs without aborting. But when I try to write it to an XML Output stage, the job aborts with a "not enough memory" error. The total record count is 1822963, so I want to split the data and write to two XML Output stages: CFF Stage -> Transformer -> XML Output (2). I think I can do this by limiting the row count for the 1st file (records 1-911481) and the 2nd file (records 911482-1822963). I think @INROWNUM can do this, but I have not used it and don't know the syntax.
Can anyone help on how to proceed with this?
Thanks in Advance !!!
XML Output - Aborting because of large number of records
There's an option to use a Trigger Column in the XML Output stage: when the value in that column changes, the stage closes the current file and opens a new output file.
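If no natural trigger column exists, a common workaround is to derive one in the Transformer (for example, a file-number column that increments every N rows) and point the XML Output stage at it. The sketch below is plain Python, not DataStage code, and just illustrates the rotation behavior described above; the row dictionaries and `trigger_key` field are hypothetical:

```python
def split_by_trigger(rows, trigger_key):
    """Group consecutive rows into separate 'files': start a new
    group whenever the value in the trigger column changes."""
    files = []
    current = []
    last = object()  # sentinel: no trigger value seen yet
    for row in rows:
        value = row[trigger_key]
        if value != last and current:
            files.append(current)  # close the current file
            current = []           # open a new one
        current.append(row)
        last = value
    if current:
        files.append(current)
    return files
```

With a derived file-number column, two trigger values would yield two output files, mirroring the two-file split the original poster wants.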
Last edited by chulett on Thu Mar 04, 2010 3:15 pm, edited 1 time in total.
-craig
"You can never have too many knives" -- Logan Nine Fingers
I tried that, but it didn't work because I have to select a column to trigger on, and every field has different values, so that approach is not possible in my job.
Instead I tried the @INROWNUM system variable in the Transformer stage.
I set the constraint for the first link to @INROWNUM < 400000 and for the second link to @INROWNUM >= 400000.
Still, it is trying to insert more than 400000 records into the first file. I am not sure where I am going wrong.
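One possible explanation, if this is a parallel job: @INROWNUM is the input row number within each partition, so with multiple partitions every partition sends its own first 400000 rows down the first link, and the first file receives more than 400000 records in total. A minimal Python sketch of that behavior (illustrative only, not DataStage code):

```python
def route_rows(partitions, limit=400000):
    """Simulate a parallel Transformer with constraints
    @INROWNUM < limit (link 1) and @INROWNUM >= limit (link 2).
    partitions: one list of rows per Transformer partition."""
    link1, link2 = [], []
    for rows in partitions:
        # @INROWNUM restarts at 1 in every partition
        for inrownum, row in enumerate(rows, start=1):
            if inrownum < limit:
                link1.append(row)
            else:
                link2.append(row)
    return link1, link2
```

For example, with two partitions the first link ends up with up to 2 * (limit - 1) rows, not limit - 1. Running the Transformer sequentially, or deriving a global row number, would be ways around this, assuming partitioning is indeed the cause.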