SIGSEGV and Partitioning

Post questions here related to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

abhilashnair
Participant
Posts: 284
Joined: Fri Oct 13, 2006 4:31 am

SIGSEGV and Partitioning

Post by abhilashnair »

When a SIGSEGV error goes away after changing the configuration file from multiple nodes to a single node, does that mean partitioning is the culprit? I have a job that fetches from an ODBC stage and inserts into an ODBC stage. The write mode is Upsert (Update then Insert). The job fails with SIGSEGV on 4 nodes and works fine on a single node.
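For reference, the only difference between the failing and working runs is the APT configuration file. A minimal single-node file looks something like this (the hostname and resource paths below are placeholders, not from the original post):

```
{
  node "node1"
  {
    fastname "etl_host"
    pools ""
    resource disk "/ds/data" {pools ""}
    resource scratchdisk "/ds/scratch" {pools ""}
  }
}
```

A 4-node file simply repeats the node block with distinct node names; that is what makes the partitioned ODBC upserts run in parallel in the failing case.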
priyadharsini
Participant
Posts: 40
Joined: Mon May 11, 2009 12:19 am
Location: Madurai

Post by priyadharsini »

What partitioning is defined on the ODBC stage?
abhilashnair
Participant
Posts: 284
Joined: Fri Oct 13, 2006 4:31 am

Post by abhilashnair »

priyadharsini wrote:What partitioning is defined on the ODBC stage?
Auto
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Try hash partitioning on the key field(s) used in the upsert.
-craig

"You can never have too many knives" -- Logan Nine Fingers
abhilashnair
Participant
Posts: 284
Joined: Fri Oct 13, 2006 4:31 am

Post by abhilashnair »

chulett wrote:Try hash partitioning on the key field(s) used in the upsert. ...
Same result even after hash partitioning: a SIGSEGV error on the target ODBC stage.
I have disabled operator combination.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

How big (in bytes) are your rows?

Is the product of array size and row size too large to fit in memory?
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
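Ray's question comes down to simple arithmetic: the target stage buffers roughly array size times row size bytes per node. A quick sketch with hypothetical numbers (the 500-byte row is an assumption, not from the thread):

```python
# Back-of-envelope estimate of the ODBC array buffer per node:
# roughly array_size * row_size bytes.

def array_buffer_bytes(array_size, row_size_bytes):
    """Approximate per-node buffer needed for one array of rows."""
    return array_size * row_size_bytes

# e.g. array size 2000 with a hypothetical 500-byte row:
print(array_buffer_bytes(2000, 500))  # 1000000 bytes, about 1 MB per node
```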
abhilashnair
Participant
Posts: 284
Joined: Fri Oct 13, 2006 4:31 am

Post by abhilashnair »

ray.wurlod wrote:How big (in bytes) are your rows?

Is the product of array size and row size too large to fit in memory? ...
The Array Size specified in the target is 2000. As for the row size, since this is a delete operation, the target metadata has only the key column. The mode is Upsert/Delete Only.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

abhilashnair wrote:Array Size specified in Target is 2000.
As a test, lower it - even dropping it down to 1 to see if that makes any difference.
-craig

"You can never have too many knives" -- Logan Nine Fingers
abhilashnair
Participant
Posts: 284
Joined: Fri Oct 13, 2006 4:31 am

Post by abhilashnair »

Dropped it to 1 and it worked... too slow, though. Is this the only way?
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Did you try any other values, or just 1? :?
-craig

"You can never have too many knives" -- Logan Nine Fingers
abhilashnair
Participant
Posts: 284
Joined: Fri Oct 13, 2006 4:31 am

Post by abhilashnair »

Looks like I need to go for trial and error here. Is there any other way to calculate exactly what I should specify for the Insert Array Size?
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Nope, you hunt for the sweet spot - go up until it blows, then back it down a notch.
-craig

"You can never have too many knives" -- Logan Nine Fingers