column limit in DataStage

vskumar08
Participant
Posts: 13
Joined: Fri May 04, 2007 12:51 am

column limit in DataStage

Post by vskumar08 »

Is there a limit on the number of columns that DataStage can read?
I used an Oracle stage to read a table with 600 columns but was not able to view the data in that stage.

I was able to view data for tables with 200 columns.

Can anyone suggest whether there are any settings that can be changed to increase the number of columns DataStage can read, or what the column limit is?
bullabai vachadu
nick.bond
Charter Member
Posts: 230
Joined: Thu Jan 15, 2004 12:00 pm
Location: London

Post by nick.bond »

Have you tried running the job? It may just be a limit for viewing within DataStage.
Regards,

Nick.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

It depends on a number of factors. For example, if you are reading via ODBC then the limit is 400 columns by default, though this can be changed, at least for that data source, by specifying MAXFETCHCOLS in the uvodbc.config file. You might also need to increase MAXFETCHBUFF, which defaults to 8 KB per row.
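
For illustration, the relevant stanza in uvodbc.config might look something like the sketch below. <OracleDSN> stands in for your actual data source name, and the values shown are placeholders raised above the defaults, not recommendations:

   <OracleDSN>
   DBMSTYPE = ODBC
   MAXFETCHCOLS = 700
   MAXFETCHBUFF = 65536

Here MAXFETCHCOLS is raised above the 600 columns in question and MAXFETCHBUFF above its 8 KB default. Note that a copy of uvodbc.config in the project directory, if present, takes precedence over the one in the engine directory, so edit the appropriate copy.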

It may also be that there's a hard-coded limit in the data browser, though I've not seen any documentation of such a limit.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
vskumar08
Participant
Posts: 13
Joined: Fri May 04, 2007 12:51 am

Post by vskumar08 »

ray.wurlod wrote:It depends on a number of factors. For example, if you are reading via ODBC then the limit is 400 columns by default, though this can be changed, at least for that data source, by specifying MAXFETCHCOLS in the uvodbc.config ...


Thanks mate, it worked.
bullabai vachadu
vskumar08
Participant
Posts: 13
Joined: Fri May 04, 2007 12:51 am

Increasing the external data source record size in DataStage

Post by vskumar08 »

vskumar08 wrote:
ray.wurlod wrote:It depends on a number of factors. For example, if you are reading via ODBC then the limit is 400 columns by default, though this can be changed, at least for that data source, by specifying MAXFETCHCOLS in the uvodbc.config ...


OK, can anyone tell me how to increase the external data source record size in DataStage, which has a limit of 32 KB by default?
bullabai vachadu
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

What precisely do you mean by external datasource in this context (for there's no such terminology in the product as far as I am aware), and where is this record size limit documented?
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
vskumar08
Participant
Posts: 13
Joined: Fri May 04, 2007 12:51 am

Post by vskumar08 »

ray.wurlod wrote:What precisely do you mean by external datasource in this context (for there's no such terminology in the product as far as I am aware), and where is this record size limit documented?
I am using an Oracle Enterprise stage to extract data from Oracle 10g, but I am getting the error: Fatal Error: Virtual data set; ...; the record is too big to fit in a block; the length requested is: 135012.

I previously thought the problem might be the number of columns, since there are 600 of them: when I reduced the number of columns to 400, it worked fine.

I have updated uvodbc.config as Ray suggested, and the job now works fine with all 600 columns when the column sizes are reduced (previously many of the columns were VARCHAR(255)).

So my observation is that it is the record length that is the problem here, not the number of columns.

Can any environment variable be updated in order to overcome this?

I was going through odbcref.pdf (p. 8), where this topic is mentioned: the external data source record size in DataStage is 32 KB, and if a record is larger than that, DataStage returns an error and terminates the application. But this may apply only to the ODBC Enterprise stage.

Please advise.
bullabai vachadu
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

The Oracle Enterprise stage does not use ODBC, so anything you found there is probably not relevant.

It remains unclear whether the block size referred to here is something inside DataStage (such as a buffer size) or something in Oracle or in the connection between the Oracle client and the database server.

It looks like the record is destined for a block smaller than 135012 bytes (131072 bytes is 128 KB). Look for any buffer size (in DataStage or in Oracle) that might be set to 128 KB and try increasing it.
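
To put rough numbers on this (treating 131,072 bytes as the block in question is an assumption, and the column sizes are taken from the earlier posts):

   600 columns x 255 bytes  = 153,000 bytes (worst case for the VARCHARs alone)
   requested record length  = 135,012 bytes
   default block size       = 131,072 bytes (128 KB)
   overflow                 = 135,012 - 131,072 = 3,940 bytes

So a 600-column record of this width can plausibly exceed a 128 KB block even without every column being full.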

Only change one thing at a time.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
vskumar08
Participant
Posts: 13
Joined: Fri May 04, 2007 12:51 am

Post by vskumar08 »

ray.wurlod wrote:The Oracle Enterprise stage does not use ODBC, so anything you found there is probably not relevant.

It remains unclear whether the block size referred to here is something inside DataStage (such a ...
Whatever the case, I am new to DataStage, so it is only natural that I make some mistakes.

The question is this: I am not able to work with very wide tables in DataStage, and I am using an Oracle database. Any solutions?
Thanks
bullabai vachadu
nick.bond
Charter Member
Posts: 230
Joined: Thu Jan 15, 2004 12:00 pm
Location: London

Post by nick.bond »

Out of curiosity, what is the source you have with 600 columns?
Regards,

Nick.
vskumar08
Participant
Posts: 13
Joined: Fri May 04, 2007 12:51 am

Post by vskumar08 »

nick.bond wrote:Out of curiosity, what is the source you have with 600 columns?
Error: ##F TFDR 000043 13:36:21(001) <Oracle_Enterprise_3,0> Fatal Error: Virtual data set.; output of "Oracle_Enterprise_3": the record is too big to fit in a block; the length requested is: 134048.

Any suggestions?
bullabai vachadu
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Please advise how this problem was resolved, so anyone encountering it in future can be helped when they search.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
JoshGeorge
Participant
Posts: 612
Joined: Thu May 03, 2007 4:59 am
Location: Melbourne

Post by JoshGeorge »

The OP made multiple posts and the issue was resolved in the other one:

http://dsxchange.com/viewtopic.php?p=23 ... 8e8#236800


Resolution suggested:

APT_DEFAULT_TRANSPORT_BLOCK_SIZE needs to be set to double the maximum record size that is expected.
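
For example, in dsenv (or as a project- or job-level environment variable) that might look like the sketch below; the value is an assumption based on the 135,012-byte record reported above, rounded up to a multiple of 8 KB:

   # 2 x 135,012 = 270,024 bytes; rounded up here to 33 x 8,192 = 270,336
   # (the rounding granularity is an assumption, not a documented requirement)
   APT_DEFAULT_TRANSPORT_BLOCK_SIZE=270336
   export APT_DEFAULT_TRANSPORT_BLOCK_SIZE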
Joshy George
vskumar08
Participant
Posts: 13
Joined: Fri May 04, 2007 12:51 am

Post by vskumar08 »

JoshGeorge wrote:The OP made multiple posts and the issue was resolved in the other one:

http://dsxchange.com/viewtopic.php?p=23 ... 8e8#236800


Resolution suggested :

APT_DEFAULT_TRANSPORT_BLOCK_SIZE needs to be set to double the maximum record size that is expected.
Yes, Josh's solution is right. Thanks, Josh.
bullabai vachadu