Post questions here related to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.
abhilashnair
Participant
Posts: 284 Joined: Fri Oct 13, 2006 4:31 am
by abhilashnair » Mon Mar 28, 2011 4:38 am
When a SIGSEGV error goes away after changing the configuration file from multiple nodes to a single node, does that mean partitioning is the culprit? I have a job that fetches from an ODBC stage and inserts via an ODBC stage with the write mode set to Upsert (Update then Insert). The job fails with SIGSEGV on 4 nodes and works fine on a single node.
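(For reference, the single-node test simply means pointing $APT_CONFIG_FILE at a one-node configuration file along these lines; the fastname and resource paths below are placeholders, not the actual values from this job:)

{
  node "node1"
  {
    fastname "etl_host"
    pools ""
    resource disk "/data/datasets" {pools ""}
    resource scratchdisk "/data/scratch" {pools ""}
  }
}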
priyadharsini
Participant
Posts: 40 Joined: Mon May 11, 2009 12:19 am
Location: Madurai
by priyadharsini » Mon Mar 28, 2011 5:33 am
What partitioning is defined on the ODBC stage?
abhilashnair
Participant
Posts: 284 Joined: Fri Oct 13, 2006 4:31 am
by abhilashnair » Mon Mar 28, 2011 6:52 am
priyadharsini wrote: What partitioning is defined on the ODBC stage?
Auto
chulett
Charter Member
Posts: 43085 Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO
by chulett » Mon Mar 28, 2011 7:08 am
Try hash partitioning on the key field(s) used in the upsert.
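As a rough illustration of why this matters, here is a minimal sketch of key-based hash partitioning (illustrative Python only, not the engine's actual partitioner): every row with the same key lands on the same node, so no two nodes ever upsert the same key concurrently.

import zlib

# Stable hash of the key, modulo the node count: same key -> same node.
def partition_for(key, num_nodes):
    return zlib.crc32(key.encode()) % num_nodes

rows = [("CUST001", "update"), ("CUST002", "insert"), ("CUST001", "insert")]
for key, action in rows:
    print(key, action, "-> node", partition_for(key, 4))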
-craig
"You can never have too many knives" -- Logan Nine Fingers
abhilashnair
Participant
Posts: 284 Joined: Fri Oct 13, 2006 4:31 am
by abhilashnair » Mon Mar 28, 2011 11:31 pm
chulett wrote: Try hash partitioning on the key field(s) used in the upsert. ...
Same result even after hash partitioning: SIGSEGV error on the target ODBC stage.
I have also disabled operator combination.
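(Operator combination is normally disabled by setting the environment variable at the job or project level, so each operator runs as its own process and the aborting operator is easier to identify:

$APT_DISABLE_COMBINATION = True
)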
ray.wurlod
Participant
Posts: 54607 Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
by ray.wurlod » Tue Mar 29, 2011 2:00 am
How big (in bytes) are your rows?
Is the product of array size and row size too large to fit in memory?
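As a worked illustration (figures are hypothetical): memory per ODBC operator instance is roughly array size × row size, so an array size of 2000 with a 500-byte row binds about 2000 × 500 = 1,000,000 bytes (~1 MB) per node, and proportionally more for wider rows.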
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
abhilashnair
Participant
Posts: 284 Joined: Fri Oct 13, 2006 4:31 am
by abhilashnair » Tue Mar 29, 2011 3:44 am
ray.wurlod wrote: How big (in bytes) are your rows?
Is the product of array size and row size too large to fit in memory? ...
The array size specified in the target is 2000. As for the row size, since this is a delete operation the target metadata only has the key column. The mode is Upsert / Delete Only.
chulett
Charter Member
Posts: 43085 Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO
by chulett » Tue Mar 29, 2011 7:07 am
abhilashnair wrote: Array Size specified in Target is 2000.
As a test, lower it - even dropping it down to 1 to see if that makes any difference.
-craig
"You can never have too many knives" -- Logan Nine Fingers
abhilashnair
Participant
Posts: 284 Joined: Fri Oct 13, 2006 4:31 am
by abhilashnair » Fri Apr 01, 2011 4:39 am
Dropped it to 1 and it worked, although it is too slow. Is this the only way?
chulett
Charter Member
Posts: 43085 Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO
by chulett » Fri Apr 01, 2011 6:47 am
Did you try any other values, or just 1?
-craig
"You can never have too many knives" -- Logan Nine Fingers
abhilashnair
Participant
Posts: 284 Joined: Fri Oct 13, 2006 4:31 am
by abhilashnair » Fri Apr 01, 2011 10:38 pm
Looks like I need to go with trial and error here. Is there any other way to calculate exactly what I should specify for the insert array size?
chulett
Charter Member
Posts: 43085 Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO
by chulett » Fri Apr 01, 2011 11:02 pm
Nope, you hunt for the sweet spot - go up until it blows, then back it down a notch.
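That hunt can be thought of as a doubling search. A minimal sketch in Python (run_job is a hypothetical stand-in for rerunning the job with a given array size; here it just pretends anything above 250 aborts):

def run_job(array_size):
    # Placeholder for the real job run: pretend sizes above 250 blow up.
    return array_size <= 250

def find_sweet_spot(ceiling=2000):
    size = 1
    # Go up until it blows...
    while size * 2 <= ceiling and run_job(size * 2):
        size *= 2
    # ...then back it down a notch: size is the largest doubling that worked.
    return size

print(find_sweet_spot())  # prints 128 with the placeholder above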
-craig
"You can never have too many knives" -- Logan Nine Fingers