stage variable max length

Posted: Wed Sep 14, 2011 12:09 pm
by ArunaDas_Maharana
hi,

I am a little confused here; could someone help?

I ran a small test that read 1816 characters through seq file --> transformer --> seq file.

In the Transformer stage I defined a stage variable whose length is left blank. I wanted to find out whether DataStage assigns 255 as the default length when none is specified, or whether the job fails if I pass more characters.

In the test the job finished successfully.

If I leave the stage variable length blank, am I allowing DataStage to use the maximum length? If yes, what is the maximum?

Is there an administrative variable that governs this length at the project level?

I searched the forum a bit but couldn't find a relevant topic; if you know of one, please provide a pointer.

Posted: Wed Sep 14, 2011 3:33 pm
by ray.wurlod
The magic word is probably "unbounded".

Posted: Wed Sep 14, 2011 5:17 pm
by ArunaDas_Maharana
Thanks, Ray, for the guidance.

I ran an experiment to confirm the theory; the job was changed to row generator --> transformer --> dataset.

The Sequential File stage failed with an insufficient buffer size error.

Though I have set the stage variable to unbounded, the job aborts with:

APT_CombinedOperatorController,3: Fatal Error: File data set, file "{0}".; output of "APT_TransformOperatorImplV1S2_volumetest_Transformer_3 in Transformer_3": the record is too big to fit in a block; the length requested is: 4925006.

Increasing APT_DEFAULT_TRANSPORT_BLOCK_SIZE didn't help!

Maybe the stage variable itself is unbounded, but other configuration settings limit the job beyond a certain record size.

So my conclusion is that even though a stage variable can be unbounded, it is best to check the volume of data you will be dealing with and the scalability of your project.
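For anyone repeating the experiment, these are the transport block size settings I would check together, not just the default one. This is only a sketch; the variable names are from the parallel engine documentation, and the defaults shown are typical values that may differ by version:

```shell
# Transport block sizing for the parallel engine (values in bytes).
# Defaults shown are typical documented values; verify against your version.
export APT_DEFAULT_TRANSPORT_BLOCK_SIZE=131072   # starting block size (128 KB)
export APT_MAX_TRANSPORT_BLOCK_SIZE=1048576      # cap on how far a block can grow (1 MB)
export APT_MIN_TRANSPORT_BLOCK_SIZE=8192         # lower bound on block size
export APT_AUTO_TRANSPORT_BLOCK_SIZE=True        # let the engine size blocks automatically
```

Note that if the maximum is left at its cap, raising only the default may have no effect, which could explain why increasing APT_DEFAULT_TRANSPORT_BLOCK_SIZE alone didn't help.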

Please share your thoughts if you think otherwise.

Posted: Thu Sep 15, 2011 1:50 am
by ray.wurlod
What value did you use for the default block size? Did you look at any of the other transport block size settings?

You might also look at the BUFFER environment variables; it may be one of these that is filling. By default you have pairs of buffers, each 3MB, with switching occurring at 50% full.
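A sketch of the buffering settings involved (variable names per the parallel engine documentation; the defaults shown correspond to the 3 MB / 50% behaviour described above, but may vary by version):

```shell
# Per-link buffering for the parallel engine.
export APT_BUFFER_MAXIMUM_MEMORY=3145728        # 3 MB per buffer (the default)
export APT_BUFFER_FREE_RUN=0.5                  # fraction of the buffer filled before switching
export APT_BUFFER_DISK_WRITE_INCREMENT=1048576  # write size when a buffer spills to disk (1 MB)
```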

Posted: Thu Sep 15, 2011 6:42 pm
by ArunaDas_Maharana
Well, for the APT block size I used the actual number of bytes it was asking for, 4925006.

APT_BUFFER_DISK_WRITE_INCREMENT=1048576
APT_IO_MAXIMUM_OUTSTANDING=2097152

The job fails while writing out to the file system or the data set, but not at the Transformer level, where the stage variable is unbounded.
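The arithmetic may explain the failure point: the record the job requested (4925006 bytes) is larger than a single buffer at the default 3 MB, so the write side can overflow regardless of what the Transformer allows. A quick check (pure arithmetic, no DataStage assumed):

```shell
# Compare the requested record length against one default 3 MB buffer.
record_bytes=4925006
buffer_bytes=$((3 * 1024 * 1024))   # 3145728
echo "record: $record_bytes bytes"
echo "buffer: $buffer_bytes bytes"
if [ "$record_bytes" -gt "$buffer_bytes" ]; then
  echo "record exceeds a single buffer by $((record_bytes - buffer_bytes)) bytes"
fi
```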