Hi,
We are receiving DB2 CLOB data in flat files, with one column up to 2 GB long, and we are trying to read this data into DataStage.
We can read the column successfully up to a length of about 200 MB, but beyond that the read fails. We defined that column as LongVarChar and did not specify any length, but no luck.
We tried the following values:
$APT_DEFAULT_TRANSPORT_BLOCK_SIZE = 268435456
$APT_MAX_DELIMITED_READ_SIZE = 608252184
$APT_MAX_TRANSPORT_BLOCK_SIZE = 1048576
Can somebody guide us on reading values this big? Thanks for all your inputs.
If I try these numbers, I get a different error message:
$APT_DEFAULT_TRANSPORT_BLOCK_SIZE = 268435456
$APT_MAX_DELIMITED_READ_SIZE = 1345550119
$APT_MAX_TRANSPORT_BLOCK_SIZE = 1048576
Sequential_File_486,0: Fatal Error: Virtual data set.; output of "Sequential_File_486": the record is too big to fit in a block; the length requested is: 350000005, the max block length is: 268435456.
When I tried increasing it further, it errored out saying:
Sequential_File_486,0: Fatal Error: Virtual data set.; output of "Sequential_File_486": the record is too big to fit in a block; the length requested is: 350000005, the max block length is: 131072.
This maximum block length (131072) is smaller than the 268435456 I had defined earlier. I can read the file successfully with the earlier value, but when I raise it I get this problem.
Are there any limits defined on these variables? Where can I find them?
Are there any other variables I can try? We are going to receive data with column lengths of more than 2 GB.
Any help would be really appreciated.
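For what it's worth, a quick back-of-the-envelope check: 2 GB sits right at the 32-bit signed integer boundary, which may be why values in this range behave oddly. This is an assumption drawn from the errors above, not from any documented DataStage limit:

```shell
# A 2 GB column length versus the largest 32-bit signed integer.
# If any of these block-size settings are held in a signed 32-bit int
# internally (an assumption, not documented), 2 GB cannot fit.
two_gb=$(( 2 * 1024 * 1024 * 1024 ))   # 2147483648 bytes
int_max=$(( 2**31 - 1 ))               # 2147483647, largest 32-bit signed value
echo "2 GB = $two_gb bytes; 32-bit signed max = $int_max"
```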
They are environment variables set for your project, accessible through the Administrator client. Search DSXchange for names such as APT_TRANSPORT_BLOCK_SIZE and APT_BUFFER_MAXSIZE, and then research these in the manuals or the IBM Information Center (on-line help).
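As a minimal sketch of what that looks like at the shell level: besides the Administrator client, such variables can typically be exported in the engine environment (e.g. the dsenv file) before jobs run. The variable names below are the ones from this thread; the values are illustrative, not verified limits:

```shell
# Illustrative exports for the transport-block environment variables
# discussed in this thread. Values are examples only; check the IBM
# Information Center for the actual permitted ranges.
export APT_DEFAULT_TRANSPORT_BLOCK_SIZE=268435456   # default block size in bytes
export APT_MAX_TRANSPORT_BLOCK_SIZE=268435456       # upper bound on block size
export APT_MAX_DELIMITED_READ_SIZE=1073741824       # max bytes per delimited read
```

The "record is too big to fit in a block" errors above suggest the maximum transport block size is the binding setting, so it should be at least as large as the default block size.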
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.