How to handle BLOB?

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

bond88
Participant
Posts: 109
Joined: Mon Oct 15, 2012 10:05 am
Location: USA

How to handle BLOB?

Post by bond88 »

Hi,
Please suggest me how to handle blob column? Its working fine if it is a direct transfer from one table to another table but its throwing error if we use transformer stage. I tried to use to_char(BLOB_Column) but it didn't work out as well. Any input ???

Thanks,

Regards,
Lohith Sama.
Bhanu
stuartjvnorton
Participant
Posts: 527
Joined: Thu Apr 19, 2007 1:25 am
Location: Melbourne

Post by stuartjvnorton »

Do you need to do anything to the BLOB in the Transformer Stage?

If not, you could just use a Copy Stage to split it (and a copy of the PK) out from the rest, put the rest through the Transformer, then join them back up together afterwards.

If you do need to do something with it, you would have to convert it. You may need to split your BLOB into several fields if it's big enough (and hope what you are looking for wasn't split).
bond88
Participant
Posts: 109
Joined: Mon Oct 15, 2012 10:05 am
Location: USA

Post by bond88 »

Thanks,

I will try the Copy stage. At this point there is no need to do anything with the BLOB. One follow-up question, though: if we do want to do some transformation with the BLOB, how can we split it?
Bhanu
kduke
Charter Member
Posts: 5227
Joined: Thu May 29, 2003 9:47 am
Location: Dallas, TX
Contact:

Post by kduke »

Find the max length of your BLOB. Most of the time you can use varchar(8000), or larger, as long as your max length fits. There are limits on record size if you are using ODBC; do a search, there are posts on raising that limit. If you are using connectors, you will need to research the limit, probably on IBM.com.
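A minimal sketch of the sizing check described above. The varchar(8000) cap is from the post; the example max length is an assumption standing in for the result of actually querying the longest BLOB in the table:

```shell
# kduke's rule of thumb: if the max BLOB length is under the varchar cap,
# the column can be mapped to varchar(cap) instead of a true LOB type.
max_blob_len=6500      # assumed: result of querying the longest BLOB
varchar_cap=8000       # cap mentioned in the post above
if [ "$max_blob_len" -lt "$varchar_cap" ]; then
  echo "varchar($varchar_cap) is big enough"
else
  echo "need a larger cap, or true LOB handling"
fi
```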
Mamu Kim
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Sorry but what kind of "transformations" could you possibly do with a BLOB? :?
-craig

"You can never have too many knives" -- Logan Nine Fingers
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Depends what kind of BLOB.

Technology is marching on. For example you might filter face pictures on whether or not they have spectacles (via Java calls).

The new InfoSphere Streams connectivity may also do what is effectively BLOB processing.

In general, though, it is still wise to treat the BLOB data type as unsupported.

CLOB is a bit easier.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
bond88
Participant
Posts: 109
Joined: Mon Oct 15, 2012 10:05 am
Location: USA

Post by bond88 »

Hi,
I am getting the following error when using a Copy stage to handle the BLOB data:
"Copy_18,1: Fatal Error: Virtual data set.; output of "Copy_18": the record is too big to fit in a block; the length requested is: 139266, the max block length is: 131072.". Could you please advise how to resolve this?

Thank you,
Bhanu
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Do an exact search here for "the record is too big to fit in a block".
-craig

"You can never have too many knives" -- Logan Nine Fingers
chandra.shekhar@tcs.com
Premium Member
Posts: 353
Joined: Mon Jan 17, 2011 5:03 am
Location: Mumbai, India

Post by chandra.shekhar@tcs.com »

Increase the value of APT_DEFAULT_TRANSPORT_BLOCK_SIZE.
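For reference, APT_DEFAULT_TRANSPORT_BLOCK_SIZE is specified in bytes; the error above reports a default block of 131072 (128 KB) against a record needing 139266 bytes. A minimal sketch, where the value 262144 (256 KB, the next power of two above the reported record length) is an assumption based on the numbers in the error, not a recommendation from this thread:

```shell
# Record length reported: 139266 bytes; default block: 131072 (128 KB).
# Raise the block size to the next power of two so the record fits.
export APT_DEFAULT_TRANSPORT_BLOCK_SIZE=262144   # 256 KB
echo "$APT_DEFAULT_TRANSPORT_BLOCK_SIZE"
```

In practice this is usually added as a job- or project-level environment variable through the DataStage Administrator rather than exported in a shell.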
Thanx and Regards,
ETL User
bond88
Participant
Posts: 109
Joined: Mon Oct 15, 2012 10:05 am
Location: USA

Post by bond88 »

I am no longer getting "The record is too big to fit in a block" since I increased the value of APT_DEFAULT_TRANSPORT_BLOCK_SIZE. Now I am getting the error below after loading 3500-3900 rows out of 8000. Please suggest a way to handle this error.

Oracle_Connector_0,0: Caught exception from runLocally(): APT_BadAlloc: Heap allocation failed..

Thanks,
Bhanu
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Did you try doing an exact search for "APT_BadAlloc: Heap allocation failed"? I got 64 matches including yours and many were marked as resolved... hopefully something in there will be helpful.
-craig

"You can never have too many knives" -- Logan Nine Fingers