Simulating an error
Moderators: chulett, rschirm, roy
I have a sequencer and I would like to simulate the raising of an error. That is, I would like the sequencer to abort halfway through so that I can restart it from that point. I assume I could use DSLogFatal in an after-job subroutine; based on a job status or user status parameter, the same after-job subroutine would not execute when I restart. I would think this is a common requirement for debugging error handling.
Any ideas would be appreciated.
Hello!
Well, you 'could' test the concept of restartability with the logic you've thought of.
When you run the sequence for the first time with, let's say, JobX having the after-job subroutine, all jobs up to JobX would execute successfully.
JobX would abort, causing the sequence to abort too and putting it in the aborted/restartable state.
Now when you rerun the entire sequence (without recompiling it, etc.), it would skip all jobs/sequences up to JobX and then try to run JobX after resetting it, since it was previously in an aborted state.
Since your after-job subroutine would still be there, it would abort again.
However, you would be able to verify the point of restart through your Director log.
BTW - if you are using a before/after-job subroutine, merely hardcoding the return value to be non-zero will cause the job to abort; you don't need to make an explicit call to DSLogFatal.
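As a sketch of that return-value trick in DataStage BASIC - the subroutine name, the flag value, and the log message below are made up for illustration - a before/after-job subroutine receives an input argument and an ErrorCode variable, and leaving ErrorCode non-zero aborts the job:

```basic
* Hypothetical after-job subroutine to simulate a failure.
* Attach it as the after-job subroutine of JobX and pass a flag
* (e.g. derived from a job parameter) as the input value.
SUBROUTINE SimulateAbort(InputArg, ErrorCode)
      ErrorCode = 0                     ;* 0 = job finishes normally
      If InputArg = "SIMULATE" Then
         Call DSLogWarn("Simulating a failure for restart testing", "SimulateAbort")
         ErrorCode = 1                  ;* non-zero return aborts the job
      End
RETURN
```

On the restart you would pass a different flag value so the subroutine returns zero and JobX completes.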
There are simpler ways of testing it, depending on what kind of jobs you have. I would, for instance, rename a source file that a job expects on the Unix box so that the job aborts, then rename the file back to its original name and rerun the sequence.
Make sure you have chosen
> Add checkpoints so sequence is restartable
and, if necessary, the
> Automatically handle jobs that fail
options in the Properties dialog for the sequence.
hope this helps
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
You've probably already noticed that job sequences do not support before/after subroutines. You will need to use Routine activities within the job sequence itself - the routines themselves will need to be created as transform functions, though these can be interludes to the requisite before/after subroutines.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
ray and dwblore, thank you for your responses.
ray, when you say that job sequences do not support before/after subroutines, I guess you mean that job sequences whose Job Activity stages run jobs that themselves have before/after subroutines will not work, right?
Can you explain what you mean by "transform functions"? Does that mean they have to be developed in C for PX jobs?
Thanks.
Transform functions are developed in DataStage BASIC (as "server routines" whose type is "transform function"). They can be called from Routine activities in job sequences.
Job activities can (and do) successfully execute jobs where those jobs have before/after subroutines. What I said was that you do not find before/after subroutines called directly by job sequences.
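A minimal sketch of such an interlude, assuming a server routine created in the Repository with type "Transform Function" (the routine and subroutine names here are hypothetical):

```basic
* Hypothetical transform function; create it as a server routine of type
* "Transform Function" so a Routine activity in a job sequence can call it.
FUNCTION RunAfterJobSub(InputArg)
      ErrorCode = 0
      * Delegate to an existing before/after subroutine (hypothetical name;
      * catalogued user routines are invoked with the DSU. prefix).
      Call DSU.MyAfterJobSub(InputArg, ErrorCode)
      Ans = ErrorCode        ;* value returned to the Routine activity
RETURN(Ans)
```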
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
So just to clarify, we have the following types of routines in PX:
1) Custom routines:
a) could be user-defined
b) CANNOT be DS-provided
The code has to be developed in C/C++ and called from DS. These CAN be called using the Routine Activity stage in a job sequencer.
2) Transform functions:
a) could be user-defined
b) could be DS-provided
These are functions that primarily perform some kind of transformation on data. These CAN be called using the Routine Activity stage in a job sequencer.
3) Before/after subroutines:
a) could be user-defined
b) could be DS-provided
These are generic routines that can be called before or after a job in a Job Activity stage in a job sequencer. However, they CANNOT be called using the Routine Activity stage.
(There are two other types of routines, Custom UniVerse Functions and ActiveX functions, but I am not going to bother with them at this time.)
I would appreciate it if you could let me know whether my assessment is correct.
Thanks.
1(b) - actually they could be DS provided, they just aren't.
There is example code provided elsewhere in the product. The reason is that you have to create, compile and link the C++ routine externally to DataStage; what goes in the Repository is only a description of the routine, not the routine itself.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Craig, thank you for your response. I went to the post. These are the steps mentioned:
In the Designer:
1. Right-click on the Palette text at the top of the palette box.
2. Choose "Customize Palette".
3. In the top-left Repository window, select Parallel -> Processing -> Basic Transformer.
4. After activating, click on the right arrow to add it to your palette.
After Step 2, I see a palette with three panes:
Repository items (top left), current palette groups and shortcut items (right), and default palette groups and shortcut items (bottom left).
Nowhere do I see Parallel -> Processing -> Basic Transformer.
-
- Charter Member
- Posts: 822
- Joined: Sat Sep 17, 2005 5:25 pm
- Location: USA