dataset is not creating when job aborts
Moderators: chulett, rschirm, roy
-
- Participant
- Posts: 38
- Joined: Sun Mar 25, 2007 11:05 pm
- Location: chennai
Hi,
The design of my job is
MQ->Copy stage -> dataset
In my job, the dataset serves as a backup. The problem is that the dataset created during the job run is deleted if the job aborts (we can see the dataset at the given path while the job is running, but it is missing after the abort). Is there any way to commit the dataset while writing, so that it is retained even if the job aborts?
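As a general illustration of what "committing while writing" usually means outside DataStage (this is not DataStage-specific, and the function name here is made up for the sketch): write the backup to a temporary name and rename it into place only once the write has finished, so an abort mid-write never destroys the previous good copy.

```python
# Generic atomic-write sketch (illustrative, not the DataStage Data Set
# mechanism): write to a temp file, flush to disk, then rename into place.
import os
import tempfile

def write_backup_atomically(path: str, records: list[str]) -> None:
    """Write records to `path` so readers never see a partial file."""
    directory = os.path.dirname(os.path.abspath(path))
    fd, tmp_path = tempfile.mkstemp(dir=directory)
    try:
        with os.fdopen(fd, "w") as f:
            for rec in records:
                f.write(rec + "\n")
            f.flush()
            os.fsync(f.fileno())    # force the data to disk before renaming
        os.replace(tmp_path, path)  # atomic rename on POSIX filesystems
    except BaseException:
        os.unlink(tmp_path)         # abort: discard the partial temp file
        raise
```

An abort between the write and the rename leaves only the temp file behind; the previously committed backup at `path` survives.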
thanks & Regards
A.S.Porkalai Lakshmi
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
When and why did the job abort?
Without knowing that there's no way a cogent answer to your question can be provided.
Please post all relevant error and warning messages.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
-
- Participant
- Posts: 38
- Joined: Sun Mar 25, 2007 11:05 pm
- Location: chennai
In our job design, another link from the Copy stage loads a Teradata TPump table.
As we are doing a destructive read from MQ, we plan to use the dataset as a backup in case of any failure.
The actual problem is that if the job aborts due to any problem, for example a break in Teradata connectivity, we lose the dataset.
Code:

mq -> copy -> dataset
       |
       V
       column import -> transformer -> TPump stage
thanks & Regards
A.S.Porkalai Lakshmi
Using that approach you could turn buffering off for the Data Set stage and get more records through, but you cannot guarantee that no data loss will occur. The "Unit of Work" MQ stage will do that for you; alternatively, do a 2-phase MQ read with transactions: a non-destructive read in one job into a table or dataset, then a second job doing the destructive read based on the records in that table.
The other options are to wait for the DT-Stage to be released (don't hold your breath) or to code your own operator. I'm sitting next to a gent from IBM's advanced consulting group at this minute who is doing exactly that.
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
Is it legal to have a passive stage (including a Data Set stage) with an input and an output link? Try splitting your job into two jobs - one to write the Data Set, the other to read from it.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
-
- Participant
- Posts: 38
- Joined: Sun Mar 25, 2007 11:05 pm
- Location: chennai
I am not reading from the dataset in the same job; another job will read the dataset in case of failure in the first job.
My problem is that the dataset is not retained (it is created at runtime but deleted if the job aborts). Is there any way to prevent the dataset from being deleted, even if the job aborts?
thanks & Regards
A.S.Porkalai Lakshmi
-
- Participant
- Posts: 38
- Joined: Sun Mar 25, 2007 11:05 pm
- Location: chennai
Sorry Ray, that was the wrong design.
This is the correct design:
Code:

mq -> copy -> dataset
       |
       V
       column import -> transformer -> TPump stage
thanks & Regards
A.S.Porkalai Lakshmi
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact: