stage variable max length

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

ArunaDas_Maharana
Participant
Posts: 42
Joined: Thu Dec 11, 2008 11:07 am

stage variable max length

Post by ArunaDas_Maharana »

Hi,

I am a little confused here; could someone help?

I ran a small test where I read 1816 characters through seq file --> transformer --> seq file.

In the Transformer stage I kept a stage variable whose length is blank. I wanted to find out whether DataStage assigns 255 as the default length when none is specified, or whether the job fails if I pass more characters.

In the test the job finished successfully.

I am interested in finding out: if I keep the stage variable length blank, am I allowing DataStage to take the maximum length? If so, what is the max length?

Is there any admin variable that governs this length at the project level?

I searched the forum a bit but couldn't find a relevant topic; if you know of one, please provide a pointer.
Thanks,
Aruna
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

The magic word is probably "unbounded".
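For illustration (hypothetical field names), in orchestrate schema terms an explicit length bounds the string, while a blank length maps to an unbounded string:

    record (
        bounded_sv: string[max=255];
        unbounded_sv: string;
    )

Here bounded_sv behaves like a VarChar(255), while unbounded_sv, declared with no length, is unbounded.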
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
ArunaDas_Maharana
Participant
Posts: 42
Joined: Thu Dec 11, 2008 11:07 am

Post by ArunaDas_Maharana »

Thanks, Ray, for the guidance.

I ran an experiment to confirm the theory; the job was changed to row generator --> transformer --> dataset.

(The Sequential File stage failed with an insufficient buffer size error, which is why the target is now a dataset.)

Though I have made the stage variable unbounded, the job aborts with:

APT_CombinedOperatorController,3: Fatal Error: File data set, file "{0}".; output of "APT_TransformOperatorImplV1S2_volumetest_Transformer_3 in Transformer_3": the record is too big to fit in a block; the length requested is: 4925006.

Increasing APT_DEFAULT_TRANSPORT_BLOCK_SIZE doesn't help!
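For reference, a dsenv-style sketch of the transport block settings involved (defaults and limits are as I recall them from the documentation; please verify against your release):

    # Transport block sizing (defaults as I recall; verify for your release)
    export APT_DEFAULT_TRANSPORT_BLOCK_SIZE=131072   # default block size, 128 KB
    export APT_MIN_TRANSPORT_BLOCK_SIZE=8192         # lower clamp on block size
    export APT_MAX_TRANSPORT_BLOCK_SIZE=1048576      # upper clamp, 1 MB

If that 1 MB upper clamp is real, no setting of the default block size alone can hold a 4925006-byte record, which would explain why raising it didn't help.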

Maybe the stage variable is unbounded, but other configuration settings limit the job beyond a certain size.

So I am coming to the conclusion that even though a stage variable can be unbounded, it is best to check the volume of the data you will be dealing with and the scalability of your project.

Please share your thoughts if you think otherwise.
Thanks,
Aruna
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

What value did you use for the default block size? Did you look at any of the other transport block size settings?

You might also look at the BUFFER environment variables; it may be one of these that is filling. By default you have pairs of buffers, each 3MB, with switching occurring at 50% full.
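A dsenv-style sketch of the buffering defaults I mean (documented defaults as far as I recall; check your release):

    export APT_BUFFER_MAXIMUM_MEMORY=3145728         # 3 MB of memory per buffer
    export APT_BUFFER_FREE_RUN=0.5                   # switch/spill once a buffer is 50% full
    export APT_BUFFER_DISK_WRITE_INCREMENT=1048576   # 1 MB writes when a buffer spills to disk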
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
ArunaDas_Maharana
Participant
Posts: 42
Joined: Thu Dec 11, 2008 11:07 am

Post by ArunaDas_Maharana »

Well, for the APT block size I used the actual number of bytes it was asking for: 4925006.

APT_BUFFER_DISK_WRITE_INCREMENT=1048576
APT_IO_MAXIMUM_OUTSTANDING=2097152

The job fails while writing out to the file system or the dataset, but not at the Transformer level, where the stage variable is unbounded.
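Putting the numbers side by side (a back-of-envelope sketch, not a verified fix): the failing record is 4925006 bytes, roughly 4.7 MB, which is larger than the 3 MB default per-buffer memory, so a single buffer can never hold one record. A hypothetical next experiment:

    # 4925006 bytes ~ 4.7 MB > 3145728 bytes (3 MB default per buffer)
    export APT_BUFFER_MAXIMUM_MEMORY=8388608         # 8 MB: untested assumption, sized above one record
    export APT_BUFFER_DISK_WRITE_INCREMENT=1048576   # unchanged from my current settings
    export APT_IO_MAXIMUM_OUTSTANDING=2097152        # unchanged from my current settings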
Thanks,
Aruna