clob data with a length of 2 GB

devnhi
Premium Member
Posts: 68
Joined: Wed Jun 17, 2009 10:47 am

clob data with a length of 2 GB

Post by devnhi »

Hi

We are receiving CLOB data from a DB2 column in flat files, with one column up to 2 GB long, and we are trying to read this data into DataStage.
We were able to read the column successfully at a length of 200 MB, but beyond that the read fails. We defined that column as LongVarChar with no length specified, but no luck.

We have tried the following values:

$APT_DEFAULT_TRANSPORT_BLOCK_SIZE = 268435456
$APT_MAX_DELIMITED_READ_SIZE = 608252184
$APT_MAX_TRANSPORT_BLOCK_SIZE = 1048576


Can somebody guide us on reading values as big as this? Thanks for all your inputs.
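
For reference, here is a sketch of what each of those three settings appears to govern; the descriptions are inferred from the variable names and from the errors later in the thread, and the values are purely illustrative rather than recommendations:

$APT_DEFAULT_TRANSPORT_BLOCK_SIZE = 268435456 (starting size, in bytes, of the transport blocks that carry records between operators)
$APT_MAX_TRANSPORT_BLOCK_SIZE = 536870912 (largest size a transport block is allowed to grow to)
$APT_MAX_DELIMITED_READ_SIZE = 1610612736 (how many bytes the sequential import will scan while looking for a record delimiter)

One detail worth double-checking in the values tried above: $APT_MAX_TRANSPORT_BLOCK_SIZE (1048576) ended up smaller than $APT_DEFAULT_TRANSPORT_BLOCK_SIZE (268435456), which looks inverted, since the maximum would presumably need to be at least as large as the default.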
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Post your failure message(s).
-craig

"You can never have too many knives" -- Logan Nine Fingers
devnhi
Premium Member
Posts: 68
Joined: Wed Jun 17, 2009 10:47 am

Post by devnhi »

Sorry, yes, this is what I am getting:


Sequential_File_486,0: Consumed more than 1345550119 bytes looking for record delimiter; aborting
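
Reading that message literally (an interpretation only, not an official formula): the sequential import scanned more than 1,345,550,119 bytes (roughly 1.25 GB) without finding a record delimiter and then gave up, so for a record of this size the delimited-read allowance would need to be at least that large:

required delimited read size >= 1,345,550,119 bytes (~1.25 GB), with further headroom for the full 2 GB case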
devnhi
Premium Member
Posts: 68
Joined: Wed Jun 17, 2009 10:47 am

Post by devnhi »

If I try these numbers, I get a different error message:

$APT_DEFAULT_TRANSPORT_BLOCK_SIZE = 268435456
$APT_MAX_DELIMITED_READ_SIZE = 1345550119
$APT_MAX_TRANSPORT_BLOCK_SIZE = 1048576

Sequential_File_486,0: Fatal Error: Virtual data set.; output of "Sequential_File_486": the record is too big to fit in a block; the length requested is: 350000005, the max block length is: 268435456.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

That latest message tells you exactly what the problem is and gives you the numbers you need to fix it. Well, if not "fix" then at least enough to move on to the next problem. :wink:
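
Spelling the numbers out as a quick check (just arithmetic on the message, not a tuning formula):

length requested for the record = 350,000,005 bytes (~334 MB)
maximum block length in effect = 268,435,456 bytes (256 MB)

So the transport block ceiling would need to be raised to at least 350,000,005 bytes, say 402,653,184 (384 MB), before this particular error can clear.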
-craig

"You can never have too many knives" -- Logan Nine Fingers
devnhi
Premium Member
Posts: 68
Joined: Wed Jun 17, 2009 10:47 am

Post by devnhi »

When I tried increasing it, it errored out saying:


Sequential_File_486,0: Fatal Error: Virtual data set.; output of "Sequential_File_486": the record is too big to fit in a block; the length requested is: 350000005, the max block length is: 131072.

This maximum block length (131072) is smaller than the 268435456 I had defined earlier. I was able to read the file successfully while that value was in effect, but as soon as I raise it further I get this problem.

Are there any limits defined on these variables? Where can I find them documented?
Are there any other variables I could try? We are going to receive data with column lengths of more than 2 GB.

Any help would be really appreciated.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

They are environment variables set for your project, accessible through the Administrator client. Search DSXchange for names such as APT_TRANSPORT_BLOCK_SIZE and APT_BUFFER_MAXSIZE, and then research these in the manuals or the IBM Information Center (on-line help).
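
As a rough guide to where such variables usually live (paths from memory, so verify against your release): project-wide defaults are set in the Administrator client under the project's Properties (Environment button), per-job overrides can be added in Designer under Job Properties > Parameters using Add Environment Variable, and installation-wide values can be exported from $DSHOME/dsenv. A dsenv-style sketch with placeholder values:

# placeholder values - check the documented ranges in the Information Center first
APT_DEFAULT_TRANSPORT_BLOCK_SIZE=1048576; export APT_DEFAULT_TRANSPORT_BLOCK_SIZE
APT_BUFFER_MAXSIZE=3145728; export APT_BUFFER_MAXSIZE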
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.