Hi,
Until now, my searches have always turned up answers to my questions on dsxchange. This time I found many questions of this type, but no proper solution.
My server job is in Production and has been running for quite some time. Today it started failing with this one-line error: "Attempting to Cleanup after ABORT raised in stage PLA_Src_Lod". I tried resetting and recompiling; nothing is working.
Your help will be highly appreciated, as it always has been.
One line error
Hey, 2 years and this is your first post. Sounds like the forum Search facility works well!!
Can you describe the job design, what kinds of stages, how they're used? If you're getting an abnormal termination, it could be the result of a math operation on a NULL value. We'll need to try to help pinpoint the issue.
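As an illustration of that NULL failure mode, here is a minimal Python analogue (transformer derivations are actually written in DataStage BASIC, where you would guard with IsNull() or NullToZero(); the function and column names below are made up for illustration):

```python
# Python analogue of a transformer derivation, using None to stand in for SQL NULL.

def unguarded_amount(price, qty):
    # Equivalent of a derivation like link.PRICE * link.QTY with no NULL check:
    # arithmetic on a NULL blows up and the job aborts mid-run.
    return price * qty

def guarded_amount(price, qty):
    # Equivalent of guarding with IsNull()/NullToZero(): treat NULL as 0.
    price = 0 if price is None else price
    qty = 0 if qty is None else qty
    return price * qty

print(guarded_amount(None, 5))   # NULL treated as 0, so this prints 0
try:
    unguarded_amount(None, 5)
except TypeError as exc:
    print(f"aborted: {exc}")    # the unguarded version raises instead
```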
Kenneth Bland
Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
Thrilled to see the quick responses.

kcbland wrote:
Can you describe the job design, what kinds of stages, how they're used? If you're getting an abnormal termination, it could be the result of a math operation on a NULL value. We'll need to try to help pinpoint the issue.
Here is my job design:
Sequential File -----> XFM ----------> Teradata MLoad stage
This job loads 1 million rows into a Teradata table.
I am not seeing any problem on the Teradata side, as it is not locking the target table.
This job has been running fine for the past six months. All the NULLs are handled properly in the XFM.
Rm
Without any error messages, even when resetting, you're left with having to hunt down the bug.
Check to see if any before/after job/stage routines are being called. If so, make sure they are still compiled. Opening a Routine and changing it without compiling it removes the compiled code. Jobs mysteriously die because the CALL for the Routine can't be handled.
Can you copy the job and replace the Teradata stage with a Sequential stage? If you run that job and it works, then it means that the data is operable by the functions and computations within the transformer. That points the finger to the Teradata stage.
If the job copy fails, then you're going to have to figure out which computation is causing the job to abort. The easiest way to figure that out is to look at any columns that aren't straight mappings. If they're being mathematically computed, that's where NULL values get into the mix. If NULLs aren't your problem, it is probably something wrong in a custom DS function.
Again, a DS function just like a routine is subject to losing its compiled code if saved but not compiled. Mysterious things happen because the job already understands there's supposed to be a function out there, but not finding it causes an abnormal termination.
Good luck.
Kenneth Bland
Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
Thank you

Thank you, guys. We found the root cause: it was a disk space problem. We cleared some space and everything started working fine.
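Since the root cause turned out to be disk space, a quick check like the following can catch it early. This is a minimal POSIX-shell sketch; PROJECT_DIR is a placeholder, so point it at your actual DataStage project directory (and also check the filesystems holding &PH& and any scratch/temp areas):

```shell
# Sketch: warn when the filesystem DataStage writes to is nearly full.
# PROJECT_DIR is an assumption -- substitute your own project path.
PROJECT_DIR="${PROJECT_DIR:-/tmp}"

# df -P guarantees one record per filesystem; column 5 is "Use%".
USED_PCT=$(df -P "$PROJECT_DIR" | awk 'NR==2 { gsub("%", "", $5); print $5 }')
echo "Filesystem for $PROJECT_DIR is ${USED_PCT}% full"

if [ "$USED_PCT" -ge 90 ]; then
    echo "WARNING: over 90% full -- jobs can abort with vague one-line errors"
fi
```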
If the DataStage log gave a more specific error description, it would be much more helpful.
Rm