How to handle BLOB?

Posted: Wed Dec 12, 2012 12:41 pm
by bond88
Hi,
Please suggest how to handle a BLOB column. It works fine as a direct transfer from one table to another table, but it throws an error if we use a Transformer stage. I tried to_char(BLOB_Column) but that didn't work either. Any input?
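(From what I can tell, to_char works on a CLOB but not a BLOB. On an Oracle source, something like the following at least returns the leading bytes as hex; MY_TABLE and BLOB_COLUMN are placeholder names:)

    SELECT RAWTOHEX(DBMS_LOB.SUBSTR(BLOB_COLUMN, 2000, 1)) AS BLOB_HEX
      FROM MY_TABLE;
    -- DBMS_LOB.SUBSTR returns RAW for a BLOB; RAWTOHEX renders those bytes as text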

Thanks,

Regards,
Lohith Sama.

Posted: Wed Dec 12, 2012 4:42 pm
by stuartjvnorton
Do you need to do anything to the BLOB in the Transformer Stage?

If not, you could just use a Copy stage to split it (and a copy of the PK) out from the rest, put the rest through the Transformer, then join them back together afterwards.
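Roughly, the job design would look like this (stage names are just illustrative):

    Source --> Copy --+--> (PK + other columns) --> Transformer --+
                      |                                           +--> Join (on PK) --> Target
                      +--> (PK + BLOB) ---------------------------+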

If you do need to do something with it, you would have to convert it. You may need to split your BLOB into several fields if it's big enough (and hope what you are looking for wasn't split).
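For example, on an Oracle source, a query along these lines might do the split (MY_TABLE, PK_COL and BLOB_COL are placeholders; DBMS_LOB.SUBSTR takes amount then offset and returns RAW for a BLOB):

    SELECT PK_COL,
           -- 2000-byte chunks keep each piece within SQL's RAW limit
           DBMS_LOB.SUBSTR(BLOB_COL, 2000, 1)    AS BLOB_PART1,
           DBMS_LOB.SUBSTR(BLOB_COL, 2000, 2001) AS BLOB_PART2,
           DBMS_LOB.SUBSTR(BLOB_COL, 2000, 4001) AS BLOB_PART3
      FROM MY_TABLE;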

Posted: Wed Dec 12, 2012 7:30 pm
by bond88
Thanks,

I will try the Copy stage. At this point there is no need to do anything with the BLOB. I do have a new question, though: if we want to do some transformation on the BLOB, how can we split it?

Posted: Wed Dec 12, 2012 9:58 pm
by kduke
Find the max length of your BLOB. Most of the time you can use varchar(8000), or larger, if your max length is less than 8000. There are limits on record size if you are using ODBC; do a search, as there are posts on raising this limit. If you are using Connectors, then you need to research the limit, probably on IBM.com.
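On an Oracle source that's a one-liner (MY_TABLE and BLOB_COL are placeholders):

    SELECT MAX(DBMS_LOB.GETLENGTH(BLOB_COL)) AS MAX_BLOB_BYTES
      FROM MY_TABLE;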

Posted: Wed Dec 12, 2012 10:18 pm
by chulett
Sorry but what kind of "transformations" could you possibly do with a BLOB? :?

Posted: Wed Dec 12, 2012 11:37 pm
by ray.wurlod
Depends what kind of BLOB.

Technology is marching on. For example, you might filter face pictures on whether or not they have spectacles (via Java calls).

The new InfoSphere Streams connectivity may also do what is effectively BLOB processing.

In general, though, it is still wise to treat the BLOB data type as unsupported.

CLOB is a bit easier.

Posted: Thu Dec 13, 2012 3:53 pm
by bond88
Hi,
I am getting the following error when using the Copy stage to handle the BLOB data:
"Copy_18,1: Fatal Error: Virtual data set.; output of "Copy_18": the record is too big to fit in a block; the length requested is: 139266, the max block length is: 131072.". Could you please advise how to resolve this?

Thank you,

Posted: Thu Dec 13, 2012 4:06 pm
by chulett
Do an exact search here for "the record is too big to fit in a block".

Posted: Thu Dec 13, 2012 11:30 pm
by chandra.shekhar@tcs.com
Increase the value of APT_DEFAULT_TRANSPORT_BLOCK_SIZE.
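For reference, the error earlier in the thread shows the current limit: a max block length of 131072 bytes (128 KB) against a 139266-byte record. The variable needs to be at least the record length; as a sketch, using the next power of two as an illustrative value:

    APT_DEFAULT_TRANSPORT_BLOCK_SIZE=262144

It is typically set as a project-level or job-level environment variable.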

Posted: Mon Jul 15, 2013 1:44 pm
by bond88
I am no longer getting "the record is too big to fit in a block" since I increased the value of APT_DEFAULT_TRANSPORT_BLOCK_SIZE. I am now getting the error below after loading 3500-3900 rows out of 8000 rows. Please suggest a way to handle this error.

Oracle_Connector_0,0: Caught exception from runLocally(): APT_BadAlloc: Heap allocation failed..

Thanks,

Posted: Mon Jul 15, 2013 1:59 pm
by chulett
Did you try doing an exact search for "APT_BadAlloc: Heap allocation failed"? I got 64 matches, including yours, and many were marked as resolved... hopefully something in there will be helpful.