DataSet storage size

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

reachmexyz
Premium Member
Posts: 296
Joined: Sun Nov 16, 2008 7:41 pm

DataSet storage size

Post by reachmexyz »

Hi All

I have a process where I read data from a file, do a lookup, and create a dataset with columns extracted from the lookup. Most of the columns extracted from the lookup are "UNICODE 4000". Because of this I am exceeding the amount of data that can be stored in a dataset block and getting the error below.
How can I increase the storage size of the dataset?

Lookup_52,1: Fatal Error: File data set, file "/app/DataStage/Rolls/work/pfeed.ds".; output of "APT_LUTProcessOp in Lookup_52": the record is too big to fit in a block; the length requested is: 133634, the max block length is: 131072.
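The numbers in the message explain the failure directly: the record is 133,634 bytes, but the maximum block length is 131,072 bytes (128 KiB), so a single record cannot fit in one block. A quick check, assuming the block size must simply exceed the longest record and is normally set to a power of two:

```python
record_len = 133634   # "the length requested", from the error message
max_block = 131072    # "the max block length" (128 KiB default)

# The job fails because the record cannot fit in a single block.
assert record_len > max_block

# Smallest power-of-two block size that would fit the record:
new_block = 1
while new_block < record_len:
    new_block *= 2
print(new_block)  # 262144 (256 KiB)
```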
vinothkumar
Participant
Posts: 342
Joined: Tue Nov 04, 2008 10:38 am
Location: Chennai, India

Post by vinothkumar »

Please do a search for this error message. It has been discussed before.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

In particular look for the environment variable that controls maximum block size in a Data Set.
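As a sketch of what that looks like once you have found it (the variable name here is an assumption based on the Parallel Engine environment variable set; verify it against your version's documentation before relying on it):

```shell
# Assumed name: APT_PHYSICAL_DATASET_BLOCK_SIZE controls the block size
# of persistent Data Set files; the default matches the 131072 bytes
# (128 KiB) reported in the error. Set it to a power of two larger than
# the longest record (133634 bytes in this case). It can be exported in
# the environment as below, or added as a job/project parameter in the
# Administrator client.
export APT_PHYSICAL_DATASET_BLOCK_SIZE=262144
```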
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.