Update query hanging while running in parallel

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

Sainath.Srinivasan
Participant
Posts: 3337
Joined: Mon Jan 17, 2005 4:49 am
Location: United Kingdom

Post by Sainath.Srinivasan »

First you need to deduce the reason for this issue before deciding how to avoid it.

Your DBA will be able to trace the session with SQLs and values.

Try running the same job on a single node, with no change in the data.
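To illustrate the tracing suggestion above: on Oracle 10g or later, a DBA could start from the dynamic performance views to see which sessions are blocked and by whom. A minimal sketch (requires SELECT privilege on V$SESSION; nothing here is specific to this poster's database):

```sql
-- Show blocked sessions together with the session holding the lock
SELECT sid,
       serial#,
       blocking_session,   -- SID of the session holding the lock
       sql_id,             -- identifier of the currently executing statement
       seconds_in_wait
FROM   v$session
WHERE  blocking_session IS NOT NULL;
```

If this returns rows while the job is hung, the blocking SID points at the partition (or other session) holding the conflicting lock.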
harshada
Premium Member
Posts: 92
Joined: Tue May 29, 2007 8:40 am

Post by harshada »

I am not a premium member :( ...so I can't read all of your comment. But I tried running it on a single node by changing the .apt file, and it worked.
I want to understand how only that one job can be configured to run on a single node, because when I change the .apt file it applies to the whole project. Also, if the job needs to run on multiple nodes, are there any ways to avoid the deadlocks?
priyadarshikunal
Premium Member
Posts: 1735
Joined: Thu Mar 01, 2007 5:44 am
Location: Troy, MI

Post by priyadarshikunal »

Before you resort to running it on a single node, try:

1. Check whether the input has duplicates on the keys defined in the WHERE clause. (A Remove Duplicates stage should give you the stats to find out whether there are duplicates.)
2. Check the trace file to see whether it is a row-level lock or a table-level lock.
3. Hash-partition before the Oracle stage on the keys defined in the WHERE clause, or on just one column (the most dominant).
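If the input rows come from a table, point 1 can also be checked in SQL before the job runs. A sketch, with `source_table` and `key_col` as hypothetical stand-ins for the actual source and update key:

```sql
-- Any rows returned indicate duplicate update keys in the source
SELECT key_col, COUNT(*) AS cnt
FROM   source_table
GROUP  BY key_col
HAVING COUNT(*) > 1;
```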

If all of the above points fail, the workarounds are:
1. Add the $APT_CONFIG_FILE environment variable to the job and change its value to a one-node configuration file.
or
2. Go to the Oracle stage properties, then Stage -> Advanced, click the selection button for the node map constraint, and select a node so that the stage runs on one node.
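For workaround 1, a one-node configuration file looks roughly like this (the fastname and resource paths are placeholders; use the values from your existing configuration file):

```
{
    node "node1"
    {
        fastname "your_server"
        pools ""
        resource disk "/ds/data" {pools ""}
        resource scratchdisk "/ds/scratch" {pools ""}
    }
}
```

Pointing $APT_CONFIG_FILE at a file like this for one job leaves the project-wide default configuration untouched.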

But I still think you should try to fix the problem instead of relying on a workaround.
Priyadarshi Kunal

Genius may have its limitations, but stupidity is not thus handicapped. :wink:
raj_prodigy
Premium Member
Posts: 4
Joined: Mon Dec 25, 2006 3:17 am
Location: India

Post by raj_prodigy »

Hi Priyadarshi Kunal,

The workaround is fine if the job runs on a single node. What if the user needs to run the job on multiple nodes?

Please clarify my doubt.

Thanks,
Bhasan.
Thanks,
Raj
harshada
Premium Member
Posts: 92
Joined: Tue May 29, 2007 8:40 am

Post by harshada »

Hi Priyadarshi

Thanks for your suggestions.
In the Advanced tab I can't see any option to define the node; all I can see is buffering options. I am using DataStage 7.5.1 Enterprise Edition.
priyadarshikunal
Premium Member
Posts: 1735
Joined: Thu Mar 01, 2007 5:44 am
Location: Troy, MI

Post by priyadarshikunal »

harshada wrote: In the Advanced tab I can't see any option to define the node; all I can see is buffering options.
Select the Stage tab from the main tabs, not the Input tab.
raj_prodigy wrote: The work around is fine if the job runs in single node. What if the user needs to run the job in multiple nodes?
The problem is that "the user cannot run it on multiple nodes because it causes locks" — so that is the workaround.

I still think either the constraints in the WHERE clause are not enough to distinctly identify a record and lock only that one, OR there are duplicates in the data, so two input records end up locking the same record or set of records.

So my suggestion is to find the problem and remove it instead of taking the other route to escape it.

A workaround should always be the last resort.
Priyadarshi Kunal

Genius may have its limitations, but stupidity is not thus handicapped. :wink:
harshada
Premium Member
Posts: 92
Joined: Tue May 29, 2007 8:40 am

Post by harshada »

I have checked the key value and it is unique.
So I tried this:
Go to the Oracle stage properties, then Stage -> Advanced, click the selection button for the node map constraint, and select a node to run that stage on one node
as suggested by Priyadarshi, and it worked. The job completed successfully :).
Thanks, guys, for all your help and suggestions.
Thanks guys for all your help and suggestions.
Post Reply