Dataset stage: Input-file data set does not exist
I have a couple of jobs.
Job 1: reads data from Oracle and writes to DB2 and a data set.
Job 2: reads that data set and updates an Oracle table.
These jobs run one after another every hour, through a scheduler.
Problem: after running for 4 hours, Job 2 fails with this error:
/home/dsadm/Ascential/DataStage/DataSets/eng/Ord: Data set initialization for "/home/dsadm/Ascential/DataStage/DataSets/eng/Ord": This input-file data set does not exist: /home/dsadm/Ascential/DataStage/DataSets/eng/Ord.
Any help appreciated
It is difficult, if not impossible, to help with the data given. Your job reads a data set located at "/home/dsadm/Ascential/DataStage/DataSets/eng/Ord", which does not exist. That would lead one to assume that something went wrong with Job 1, or that Job 1 wrote a data set located elsewhere on your system.
The jobs run fine for 4 to 6 hours automatically through the scheduler, but after that Job 2 fails.
The requirement: read the Oracle table and write to DB2, and at the same time update a field in the source (Oracle) table after each successful write into DB2.
So Job 1:
reads the Oracle table and writes to DB2 and a data set.
Job 2:
reads that data set and updates a field in the Oracle table.
All you really did is repeat yourself, which is not all that helpful.
How exactly does this 'scheduler' run them 'one after another'? Without more specifics, my guess is that at some point one job or the other starts taking more than an hour and you get some kind of 'overlap' where they run out of sequence.
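One way to guard against that kind of overlap (a sketch, not the poster's actual setup) is to schedule a single wrapper script instead of the two jobs independently, and let the wrapper serialize them with the `dsjob` command-line client. The project and job names below are placeholders, and the exact exit codes `dsjob -jobstatus` returns depend on your DataStage version, so check them against your own installation:

```shell
#!/bin/sh
# Sketch: serialize the two jobs so Job2 never starts before Job1 has
# finished. PROJECT and the job names are placeholders; DSJOB defaults
# to the real dsjob client but can be overridden for testing.
DSJOB="${DSJOB:-dsjob}"
PROJECT="${PROJECT:-MyProject}"

run_job() {
    # -wait blocks until the job finishes; -jobstatus makes the dsjob
    # exit code reflect the job's finishing status.
    "$DSJOB" -run -wait -jobstatus "$PROJECT" "$1"
}

# On the DataStage server the scheduler would call something like:
# run_job Job1 && run_job Job2 || echo "Job1 failed; Job2 skipped" >&2
```

Because the wrapper only starts Job 2 after `dsjob` reports Job 1 finished, a slow run delays the pair rather than letting them run out of sequence.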
-craig
"You can never have too many knives" -- Logan Nine Fingers
"You can never have too many knives" -- Logan Nine Fingers
-
- Participant
- Posts: 9
- Joined: Mon Jun 29, 2009 4:27 pm
-
- Participant
- Posts: 3337
- Joined: Mon Jan 17, 2005 4:49 am
- Location: United Kingdom
-
- Participant
- Posts: 62
- Joined: Sat Mar 07, 2009 4:59 am
- Location: Chicago
- Contact:
Check Job 1
Hi
First check that Job 1 finished successfully. If it did not, the data set will give an error for the rest of the jobs, even though it was created by previous runs. If Job 1 is successful, check whether the data set was created at some other location. Finally, go into the first job, manually change the data set name to something like sample.ds, run the jobs one by one, and see the results.
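The check above can also be scripted as a pre-flight step before launching Job 2. This is a minimal sketch; the path is the one from the error message, and testing only the descriptor file is a heuristic, since a parallel data set also keeps its records in separate segment files that the descriptor points to:

```shell
#!/bin/sh
# Sketch: before launching Job2, verify that the data set descriptor
# file written by Job1 exists and is non-empty.
check_dataset() {
    ds="$1"
    if [ ! -f "$ds" ]; then
        echo "missing: $ds" >&2
        return 1
    fi
    if [ ! -s "$ds" ]; then
        echo "empty: $ds" >&2
        return 1
    fi
    return 0
}

# Example:
# check_dataset /home/dsadm/Ascential/DataStage/DataSets/eng/Ord \
#     && echo "safe to run Job2"
```

If the check fails right before a scheduled Job 2 run, that narrows the problem to Job 1 not having produced (or having relocated) the data set, rather than to Job 2 itself.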
Suresh Reddy
ETL Developer
Research Operations
"its important to know in which direction we are moving rather than where we are"