is there any way to increase the inter stag rowbuffer?
Moderators: chulett, rschirm, roy
Hi,
I get this error when I enable the inter-process row buffer:
" row too big for inter stage rowbuffer "
I searched the forum, but it didn't solve my problem.
My question is: how can I increase the row buffer size to 20 MB, because my row size is about 20 MB? Is there any setting or configuration I can play with?
Thanks in advance.
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
20MB per ROW ?!!!
Maybe some kind of re-design is in order.
That is an utterly ridiculous row size. What's in a row? Do you really need to move it all together? Can you move part rows? Or even not move some parts through DataStage at all? Anything that doesn't need transformation can be moved outside of DataStage, whether or not under DataStage control.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Sorry guys
I'm not making these sizes up.
I really have a 17,296 KB file, and many files are more than 15,000 KB,
and the size increases a little with every (monthly) load.
I know about Kumar's answer, that the maximum size available is 1024 KB; I'm only asking if there is any other solution.
Sorry if I did or said something wrong.
Hi Waklook,
Are your files one-line files, or are you trying to read them in as one row? Maybe there is a problem with the EOL characters in your file.
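One quick way to check Chad's theory, outside of DataStage entirely, is to count the newline characters in a suspect file. This is my own sketch, not a DataStage feature; if a multi-megabyte file turns out to contain zero or one newlines, the stage will indeed see it as a single huge row.

```python
def line_stats(path, chunk_size=1 << 20):
    """Return (total_bytes, newline_count) for a file, read in 1 MB chunks
    so even a 20 MB file never has to fit in memory as one string."""
    total = newlines = 0
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            total += len(chunk)
            newlines += chunk.count(b"\n")
    return total, newlines
```

If `line_stats("bigfile.xml")` reports millions of bytes but no newlines, the "one row" behaviour is coming from the file itself, not from any buffer setting.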
Thanks,
Chad
__________________________________________________________________
"There are three kinds of people in this world; Ones who know how to count and the others who don't know how to count !"
The reason there's a 1MB upper limit on row buffers is that no-one figured anyone would need anywhere near that. The default limit for ODBC connections, for example, is 8KB per row.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Hi,
Thanks to all of you for your replies.
I have XML files with a maximum size of around 20 MB.
When I use the Link Partitioner and Link Collector, I have to enable the inter-process row buffer. At that point it moves every file as one record, and because the file (row) is too big, I get that error. Maybe I'm understanding it the wrong way.
I have tried using the file path option, but I failed. This month I have > 700,000 XML files and it takes > 40 hours to finish loading. I have other development going on at the same time; I'm trying to improve the performance of this job and failing, so I'm asking you guys for help with these questions.
Thanks anyway to all of you.
It always helps to explain what you are doing rather than ask a single targeted question with nothing to give it context.
The size of the XML file shouldn't be an issue. I've processed XML files that are hundreds of megabytes in size without issue and without buffering or IPC stages or any other hooey like that. I've also done up to 500,000 'at one time' as well.
Describe your job design in detail, please. The first thing you should be doing is parsing the XML and flattening it into rows, not passing it whole down your job. For example:
Folder -> XML Input -> everything else
The folder stage will want to move the file as one large record by default, but that's simple to override. Only create one field in the Folder to return the filename and then set the XML Input stage to use the URL/File path column content option on the XML Source tab. Then it reads the XML file directly rather than taking it as one huge record from the Folder stage and the 'everything else' part of your job simply passes normal rows downstream.
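The underlying idea, parse the XML as a stream and flatten it into normal rows instead of shipping the whole document as one record, can be sketched outside DataStage with Python's standard library. The `record` element and field names below are hypothetical placeholders, not anything from the poster's files.

```python
import xml.etree.ElementTree as ET

def flatten(xml_path, record_tag="record"):
    """Stream an XML file and yield one flat dict per record element,
    so a 20 MB document never travels as a single row in memory."""
    for event, elem in ET.iterparse(xml_path, events=("end",)):
        if elem.tag == record_tag:
            yield {child.tag: child.text for child in elem}
            elem.clear()  # release the parsed element before moving on
```

This mirrors what the XML Input stage does when it reads from a file path: the downstream of the parser only ever sees ordinary, small rows.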
-craig
"You can never have too many knives" -- Logan Nine Fingers
"You can never have too many knives" -- Logan Nine Fingers