Hi,
I am a little confused here; could someone help?
I ran a small test that read 1816 characters through seq file --> transformer --> seq file.
In the Transformer stage I defined a stage variable whose length is left blank. I wanted to find out whether DataStage assigns 255 as the default length when none is specified, or whether the job fails if I pass more characters than that.
In the test the job finished successfully.
What I want to know is: if I leave the stage variable length blank, am I allowing DataStage to take the maximum length? If so, what is that maximum?
Is there an administrative variable that governs this length at the project level?
I searched the forum a bit but couldn't find a relevant topic; if you know of one, please provide a pointer.
stage variable max length
Thanks,
Aruna
Thanks, Ray, for the guidance.
I tried to run an experiment to confirm the theory; the job was changed to row generator --> transformer --> dataset (the Sequential File stage had failed with a buffer size insufficient error).
Even though I left the stage variable unbounded, the job aborts with:
APT_CombinedOperatorController,3: Fatal Error: File data set, file "{0}".; output of "APT_TransformOperatorImplV1S2_volumetest_Transformer_3 in Transformer_3": the record is too big to fit in a block; the length requested is: 4925006.
Increasing APT_DEFAULT_TRANSPORT_BLOCK_SIZE didn't help!
Maybe the stage variable itself is unbounded, but other settings limit the job beyond a certain data volume.
So my conclusion is that even with an unbounded stage variable, it is best to check the volume of data you will be dealing with and the scalability requirements of your project.
Please share your thoughts if you think otherwise.
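To put rough numbers on the abort, here is a quick sketch of the block-size arithmetic. The record length comes from the fatal error above; the 131072 starting value is only my assumption about the default transport block size (it varies by version), so verify it for your installation:

```shell
record_len=4925006   # "the length requested is: 4925006" from the fatal error
block=131072         # assumed default transport block size (verify per version)

# Double the block size until one record fits in a single block
while [ "$block" -lt "$record_len" ]; do
  block=$((block * 2))
done

echo "$block"        # smallest power-of-two block that holds the record: 8388608
```

In other words, a record of ~4.9 MB needs a transport block of about 8 MB, which is far above the defaults, so simply raising one variable a little would not be enough.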
Thanks,
Aruna
What value did you use for the default block size? Did you look at any of the other transport block size settings?
You might also look at the BUFFER environment variables; it may be one of these that is filling. By default you have pairs of buffers, each 3MB, with switching occurring at 50% full.
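For reference, a sketch of the settings described above as environment-variable exports. The APT_* names are the standard parallel-engine variables; the values shown are the defaults as I understand them (3 MB buffers, switching at 50% full), so treat them as assumptions to check against your version's documentation:

```shell
# Transport block sizing (values are illustrative, not recommendations)
export APT_DEFAULT_TRANSPORT_BLOCK_SIZE=1048576   # starting transport block size, bytes
export APT_MAX_TRANSPORT_BLOCK_SIZE=8388608       # upper limit for transport blocks, bytes

# Inter-operator buffering: pairs of buffers, 3 MB each by default,
# with switching when a buffer reaches 50% full
export APT_BUFFER_MAXIMUM_MEMORY=3145728          # 3 MB per buffer
export APT_BUFFER_FREE_RUN=0.5                    # fraction full at which switching occurs
```

These are set at the project or job level (Administrator or job parameters), not usually in a login shell; the exports above just show the names and shapes of the values.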
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.