Is there any way to increase the inter-stage row buffer?

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

waklook
Charter Member
Charter Member
Posts: 31
Joined: Sat Dec 10, 2005 4:13 am
Contact:

Is there any way to increase the inter-stage row buffer?

Post by waklook »

Hi,

I get this error when activating the inter-process row buffer:
"row too big for inter stage rowbuffer"
I searched the forum, but that didn't solve my problem.

My question is: how can I increase the row buffer size to 20MB, because my row size is about 20MB? Is there any setting or configuration I can play with?

Thanks in advance.
kumar_s
Charter Member
Charter Member
Posts: 5245
Joined: Thu Jun 16, 2005 11:00 pm

Post by kumar_s »

Performance tab in Job Properties, or in the Administrator client for project defaults.
Impossible doesn't mean 'it is not possible' actually means... 'NOBODY HAS DONE IT SO FAR'
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

20MB per ROW ?!!!

Maybe some kind of re-design is in order.

That is an utterly ridiculous row size. What's in a row? Do you really need to move it all together? Can you move partial rows? Or even not move some parts through DataStage at all? Anything that doesn't need transformation can be moved outside of DataStage, whether or not under DataStage control.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
waklook
Charter Member
Charter Member
Posts: 31
Joined: Sat Dec 10, 2005 4:13 am
Contact:

Post by waklook »

Hi again,

Ray, say it's 5MB; how can I do that?
chulett
Charter Member
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

So... you're just making these sizes up? :?

Kumar has already answered your question as asked. Ray has attempted to bring some sanity to it.
-craig

"You can never have too many knives" -- Logan Nine Fingers
waklook
Charter Member
Charter Member
Posts: 31
Joined: Sat Dec 10, 2005 4:13 am
Contact:

Post by waklook »

Sorry guys,

I'm not making these sizes up.

I really have a 17,296KB file, and many files that are more than 15,000KB, and the size increases a little bit with every (monthly) load.

I know about Kumar's answer; the maximum size available is 1024KB. I'm only asking if there is any other solution.

Sorry if I did or said anything wrong.
chulett
Charter Member
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

File sizes are one thing, you were talking about 20MB ROW sizes. That's a completely different animal. What kind of average row size are you dealing with?
-craig

"You can never have too many knives" -- Logan Nine Fingers
mctny
Charter Member
Charter Member
Posts: 166
Joined: Thu Feb 02, 2006 6:55 am

Post by mctny »

Hi Waklook,

Are your files one-line files, or are you trying to read each one as a single row? Maybe there is a problem with the EOL characters in your files.
Thanks,
Chad
__________________________________________________________________
"There are three kinds of people in this world; Ones who know how to count and the others who don't know how to count !"
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

The reason there's a 1MB upper limit on row buffers is that no-one figured anyone would need anywhere near that. The default limit for ODBC connections, for example, is 8KB per row.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
kumar_s
Charter Member
Charter Member
Posts: 5245
Joined: Thu Jun 16, 2005 11:00 pm

Post by kumar_s »

Or are there any kind of 'large objects' involved?
Impossible doesn't mean 'it is not possible' actually means... 'NOBODY HAS DONE IT SO FAR'
waklook
Charter Member
Charter Member
Posts: 31
Joined: Sat Dec 10, 2005 4:13 am
Contact:

Post by waklook »

Hi,

Thanks to all of you guys for your replies.

I have XML files with a maximum size of around 20MB. When I use the Link Partitioner and Link Collector I have to activate the inter-process row buffer; at that point it moves every file as one record, and because the file (row) is too big I get that error. Maybe I'm wrong in understanding it that way.

I have tried using the file path but failed. This month I have >700,000 XML files and it takes >40 hours to finish loading. I have some other development going on; at the same time I'm trying to improve the performance of this job, and I have failed, so I'm asking you guys to help me by asking these questions.

Thanks anyway to all of you.
kumar_s
Charter Member
Charter Member
Posts: 5245
Joined: Thu Jun 16, 2005 11:00 pm

Post by kumar_s »

Don't you have any transformation involved for the XML files?
If not, why can't you move them outside of DataStage, e.g. via FTP?
The other option is to parse the XML files using XML readers with specific metadata.
Impossible doesn't mean 'it is not possible' actually means... 'NOBODY HAS DONE IT SO FAR'
chulett
Charter Member
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

It always helps to explain what you are doing rather than ask a single targeted question with nothing to give it context. :wink:

The size of the XML file shouldn't be an issue. I've processed XML files that are hundreds of megabytes in size without issue and without buffering or IPC stages or any other hooey like that. I've also done up to 500,000 'at one time' as well.

Describe your job design in detail, please. The first thing you should be doing is parsing the XML and flattening it into rows, not passing it whole down your job. For example:

Folder -> XML Input -> everything else

The folder stage will want to move the file as one large record by default, but that's simple to override. Only create one field in the Folder to return the filename and then set the XML Input stage to use the URL/File path column content option on the XML Source tab. Then it reads the XML file directly rather than taking it as one huge record from the Folder stage and the 'everything else' part of your job simply passes normal rows downstream.
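The design above can be sketched outside of DataStage as well. This is a minimal Python illustration of the same pattern: one step produces only filenames (like the Folder stage returning a filename column), and a second step parses each file directly and yields normal flattened rows downstream (like the XML Input stage using the URL/File path option), so no 20MB document is ever treated as a single record. The element names (`order`, `id`, `amount`) are purely hypothetical example metadata, not anything from the original job.

```python
import xml.etree.ElementTree as ET
from pathlib import Path

def filenames(folder):
    """Analogue of the Folder stage: emit only the file paths."""
    return sorted(Path(folder).glob("*.xml"))

def flatten(path):
    """Analogue of the XML Input stage reading via the file-path column:
    parse the file directly and yield ordinary rows downstream."""
    # iterparse streams the document, so even a very large file
    # never has to travel as one huge in-memory "row"
    for _, elem in ET.iterparse(path):
        if elem.tag == "order":  # illustrative record element
            yield {"id": elem.get("id"), "amount": elem.findtext("amount")}
            elem.clear()  # release parsed elements as we go

# usage sketch:
# for f in filenames("/data/xml"):
#     for row in flatten(f):
#         load(row)  # "everything else" sees normal rows
```

The key design point is the same as in the job: only a small filename record crosses the link, and the parser flattens the document into rows at the point of reading.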
-craig

"You can never have too many knives" -- Logan Nine Fingers
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Had you mentioned XML in the original post, you would have received that advice much earlier.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
chulett
Charter Member
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Exactly.
-craig

"You can never have too many knives" -- Logan Nine Fingers
Post Reply