My job is running, running, still running

Post questions here relating to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

vasam
Participant
Posts: 30
Joined: Wed Nov 04, 2009 5:06 am

My job is running, running, still running

Post by vasam »

Hi All,

I have a job that works like this:
I am extracting the maximum key value from Teradata (the source), and deleting the records from an Oracle table whose key values are greater than that maximum key value.
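In SQL terms, the delete is roughly like the statement below (the table and column names here are just placeholders, not the real ones):

-- placeholder names; deletes every row whose key is above the Teradata maximum
DELETE FROM target_table
WHERE  key_col > :max_key_from_teradata;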

When I run this job, it just keeps running and running...

I am using a 2-node config file.

Could you please suggest why my job is still running?

Thanks in advance
Vijay
vijayakumargoud
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

Typically, never-ending jobs are caused by blocks or locks on the database side. Try running your job writing to a Peek stage instead of to Oracle. Does it still hang?
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Or run it on a single node.
-craig

"You can never have too many knives" -- Logan Nine Fingers
vasam
Participant
Posts: 30
Joined: Wed Nov 04, 2009 5:06 am

Post by vasam »

Thanks for your quick responses...

I ran the job on a single node as well and I am facing the same problem; the job is still running...


Thanks,
Vijay
vijayakumargoud
priyadarshikunal
Premium Member
Posts: 1735
Joined: Thu Mar 01, 2007 5:44 am
Location: Troy, MI

Post by priyadarshikunal »

Is it a big table?
Is the column used in the delete constraint indexed?
How many records are getting deleted?
Are there duplicate keys being passed to the delete?

Deletes are expensive and can take a lot of time when a huge number of records is involved. Get your DBA to monitor for any locks being created on the database.
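For example, your DBA could run something along these lines on the Oracle side (assuming a reasonably recent Oracle version where v$session carries these columns) to see whether your session is being blocked:

-- sessions currently waiting on a lock held by another session
SELECT sid, serial#, blocking_session, event, seconds_in_wait
FROM   v$session
WHERE  blocking_session IS NOT NULL;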

Also tell us more about the job design.
Priyadarshi Kunal

Genius may have its limitations, but stupidity is not thus handicapped. :wink:
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

A single node should remove any locking issues from the table. As noted, deletes are very expensive - how many does the job need to do? What are you seeing in the Monitor?
-craig

"You can never have too many knives" -- Logan Nine Fingers
vasam
Participant
Posts: 30
Joined: Wed Nov 04, 2009 5:06 am

Post by vasam »

My source table (Teradata) has only 89 records.
The target table (Oracle) has only 29 records with key values greater than the maximum key value.

As you guys suggested, I used a Peek stage instead of Oracle (the target) and the job ran fine.

My job design:
Teradata ---> T/R -------> Oracle

Thanks,
Vijay
vijayakumargoud
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

How many records are in the target table in total? Is the field your delete is based on indexed? Wondering about full table scans...
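If you're not sure, something like this (with your real table, column and key value substituted; these names are only placeholders) will show whether Oracle is planning a full table scan for the delete:

-- placeholder names; show the plan the optimizer chooses for the delete
EXPLAIN PLAN FOR
  DELETE FROM target_table WHERE key_col > 12345;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);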
-craig

"You can never have too many knives" -- Logan Nine Fingers