Search found 55 matches
- Thu Mar 08, 2007 1:16 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Job Abort After 50 warnings
- Replies: 13
- Views: 6689
Hi Craig, I had to add a job parameter DSJ_LIMITWARN and give it a default value of zero (0), so that 0 sets the warning limit to NO LIMIT. This was recommended by IBM support. I will let you know the outcome when I encounter this problem again, because my jobs only abort sometimes. Thank you al...
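The same no-limit idea can also be applied per run from the command line with the `dsjob` client that ships with DataStage. A minimal sketch, assuming `dsjob` is on the path; the project and job names below are placeholders, and `-warn 0` is taken to mean "no warning limit", mirroring the DSJ_LIMITWARN setting described above:

```shell
# Run the job with the warning limit disabled for this run only.
# MyProject and MyJob are placeholder names, not from the post.
dsjob -run -mode NORMAL -warn 0 MyProject MyJob
```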
- Thu Mar 08, 2007 8:37 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Job Abort After 50 warnings
- Replies: 13
- Views: 6689
- Thu Mar 08, 2007 8:34 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Job Abort After 50 warnings
- Replies: 13
- Views: 6689
Hi chulett, then why do I have this option in the Director that lets me set the warning limit to NO LIMIT? Sometimes my job runs just fine with more than 50 warnings when I run it from the Director, and sometimes it aborts when it hits 50 warnings. Can you please explain? I...
- Thu Mar 08, 2007 8:18 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Job Abort After 50 warnings
- Replies: 13
- Views: 6689
- Thu Mar 08, 2007 8:14 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Job Abort After 50 warnings
- Replies: 13
- Views: 6689
Job Abort After 50 warnings
Hi Everyone,
I have a job which aborts after 50 warnings. The warning limit is set to unlimited, but the job still aborts sometimes and runs just fine at other times. Can there be any other reason for this to happen?
thanks
- Wed Mar 07, 2007 2:23 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: New features related to automatic error handling in Sequence
- Replies: 4
- Views: 1182
- Wed Mar 07, 2007 1:28 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: New features related to automatic error handling in Sequence
- Replies: 4
- Views: 1182
- Wed Mar 07, 2007 1:19 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Problem with execute_command activity
- Replies: 32
- Views: 11741
shettar, I had a similar situation. I had to remove the files from a specific path and I was using rm -rf #$jpJobParameter# (I was passing #$jpJobParameter# as the parameter beneath the command). That scenario did not work for me, so I then tried the entire thing: rm -rf /apps/..../......
- Wed Mar 07, 2007 1:11 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: default length of a string in job parameter
- Replies: 2
- Views: 721
- Tue Feb 27, 2007 10:18 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Bad Data / Record Capture
- Replies: 3
- Views: 965
Hi guru, thanks for your reply. My sources are flat files and the fatal error is "Failure during execution of operator logic". The same job runs fine all the time, except in some scenarios where I am not able to figure out what is causing the abort. I am sure that there is some bad data coming ...
- Tue Feb 27, 2007 10:03 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Bad Data / Record Capture
- Replies: 3
- Views: 965
Bad Data / Record Capture
Hi Everyone,
One of my jobs aborts when I run it, due to some bad data issues. Is there any way in DataStage to capture the record/data when the job aborts, so that I can look at the record/data and work out what is causing the problem?
Thanks
sudharma
- Fri Feb 23, 2007 2:24 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: SEQUENCE ISSUE
- Replies: 3
- Views: 1057
- Fri Feb 23, 2007 1:20 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: SEQUENCE ISSUE
- Replies: 3
- Views: 1057
SEQUENCE ISSUE
Hi Everyone, I have to look for a file XYZ.Txt in one particular path, and I have to check that no files exist in another path. If both of these conditions are satisfied (I can see the file XYZ.Txt and no files exist in the other path), then I have to stop my job from run...
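The two checks described above can be sketched as a small shell function, for example to call from an Execute Command activity in the sequence. This is a minimal sketch under stated assumptions: the two directory paths are passed as arguments, and a zero exit status means "stop condition met":

```shell
#!/bin/sh
# Returns 0 ("stop the job") when XYZ.Txt exists under the first
# directory AND the second directory contains no files at all;
# returns non-zero otherwise. Both paths are caller-supplied.
stop_condition() {
    trigger_dir="$1"   # path that must contain XYZ.Txt
    empty_dir="$2"     # path that must be empty
    [ -f "$trigger_dir/XYZ.Txt" ] && [ -z "$(ls -A "$empty_dir" 2>/dev/null)" ]
}

# Example use:
#   stop_condition /path/one /path/two && echo "stop the sequence"
```

The sequence could then branch on the command's exit status with a Nested Condition activity.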
- Fri Feb 23, 2007 12:37 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Job Compilation Error
- Replies: 1
- Views: 812
Job Compilation Error
Hi Everyone, I get this error when I run my sequence: (Error Compiling Job control SubRoutine). When I click on the MORE option, I see this warning message: Compiling: Source = 'RT_BP266/JOB.679453303.DT.1429939518', Object = 'RT_BP266.O/JOB.679453303.DT.1429939518' ***********************************...
- Thu Feb 22, 2007 4:10 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Execute Command - Parameter
- Replies: 3
- Views: 1254
Execute Command - Parameter
Hi Everyone, I have an Execute Command activity which removes a file from a path: rm -rf /path name/. Can I pass a parameter in place of rm -rf /pathname/? I tried to pass a parameter, but I get an error saying that I can't pass a parameter in an executable path. Can anyone suggest any other po...
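One common workaround is to keep the executable part fixed and parameterise only the path, since the error above suggests the activity rejects parameters in the command (executable) portion itself. A minimal shell sketch of that split; `target_path` is a hypothetical stand-in for the job parameter, and quoting it guards against spaces in the path:

```shell
#!/bin/sh
# Fixed command ("rm -rf"), parameterised path: only the path varies.
# target_path stands in for a job parameter (hypothetical name).
target_path="/tmp/demo_cleanup_dir"

mkdir -p "$target_path"   # set up something to delete, for the demo
rm -rf "$target_path"     # fixed executable, variable argument
```

In the Execute Command activity the equivalent split would be `rm -rf` in the command field with the parameter supplied as its argument.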