How to handle BLOB?
Hi,
Please suggest how to handle a BLOB column. It works fine as a direct transfer from one table to another, but it throws an error when we use a Transformer stage. I tried to_char(BLOB_Column) but that didn't work either. Any input?
Thanks,
Regards,
Lohith Sama.
Bhanu
Do you need to do anything to the BLOB in the Transformer Stage?
If not, you could just use a Copy Stage to split it (and a copy of the PK) out from the rest, put the rest through the Transformer, then join them back up together afterwards.
If you do need to do something with it, you would have to convert it. You may need to split your BLOB into several fields if it's big enough (and hope what you are looking for wasn't split).
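The split/transform/join pattern described above can be sketched in plain Python (not DataStage; the table and column names here are made up for illustration): set the BLOB aside keyed by the primary key, transform the remaining columns, then stitch the BLOB back on.

```python
# Illustrative sketch of the pattern: split the BLOB (plus a copy of the PK)
# away from the rest, transform only the non-BLOB columns, then rejoin by PK.
rows = [
    {"id": 1, "name": "alice ", "photo": b"\x89PNG..."},
    {"id": 2, "name": "bob ",   "photo": b"\xff\xd8..."},
]

# "Copy stage": split the BLOB (with its PK) out from the other columns
blobs = {r["id"]: r["photo"] for r in rows}
rest  = [{k: v for k, v in r.items() if k != "photo"} for r in rows]

# "Transformer stage": derivations touch only the non-BLOB columns
transformed = [{**r, "name": r["name"].strip().upper()} for r in rest]

# "Join stage": reattach the BLOB by primary key
result = [{**r, "photo": blobs[r["id"]]} for r in transformed]
```

The BLOB bytes never pass through the transform step, which is the point of the Copy/Join detour.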
Find the max length of your BLOB. Most of the time you can use VarChar(8000) if your max length is under 8000, or something larger if it is not. There are limits on record size if you are using ODBC; do a search, as there are posts on raising this limit. If you are using Connectors, you will need to research the limit, probably on IBM.com.
Mamu Kim
Depends what kind of BLOB.
Technology is marching on. For example you might filter face pictures on whether or not they have spectacles (via Java calls).
The new InfoSphere Streams connectivity may do some kind of what is effectively BLOB processing too.
In general, though, it is still wise to assume that the BLOB data type remains unsupported.
CLOB is a bit easier.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Hi,
I am getting the following error when using a Copy stage to handle BLOB data.
"Copy_18,1: Fatal Error: Virtual data set.; output of "Copy_18": the record is too big to fit in a block; the length requested is: 139266, the max block length is: 131072.". Could you please advise how to resolve this?
Thank you,
Bhanu
I am no longer getting "The record is too big to fit in a block" since I increased the value of APT_DEFAULT_TRANSPORT_BLOCK_SIZE. Now I am getting the error below after loading 3500-3900 of 8000 rows. Please suggest a way to handle this error.
Oracle_Connector_0,0: Caught exception from runLocally(): APT_BadAlloc: Heap allocation failed..
Thanks,
Bhanu