Hi
I am showing the error I got in the Director log; there is no log entry that says the delete failed. Moreover, in the job design for the hashed file stage we have checked the Update action settings:
- the Update action is "Clear file before writing", and
- the Create File option is unchecked
Magesh S
Search found 68 matches
- Tue Mar 28, 2006 5:43 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: UVOpen mkdbfile: cannot create file
- Replies: 8
- Views: 2365
- Tue Mar 28, 2006 5:42 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: UVOpen mkdbfile: cannot create file
- Replies: 8
- Views: 2365
- Tue Mar 28, 2006 2:14 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: UVOpen mkdbfile: cannot create file
- Replies: 8
- Views: 2365
No, it is not giving any other message. The day before, the run failed because of a space issue on the server, so we deleted some unwanted files and freed up space. Then we restarted the job and it failed with the error below. We reset the job and the error message is DSD.UVOpen Unable to o...
- Mon Mar 27, 2006 11:30 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: UVOpen mkdbfile: cannot create file
- Replies: 8
- Views: 2365
UVOpen mkdbfile: cannot create file
Hi In my job I am selecting data from a table (1077070 rows) and writing it to a hashed file, and this hashed file is used as a lookup in another job. This job has been running fine for more than a year; suddenly it is giving the error DSD.UVOpen mkdbfile: cannot create file /opt/etlbatch/hashfiles/HAdcCharge_026. I have checked the s...
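Since the hashed file could not be created right after a disk-full incident, a pre-flight free-space check on the hashfile directory can rule that cause out quickly. A minimal sketch in Python (a hypothetical helper, not part of DataStage; the path and the 1 GB threshold are placeholders):

```python
import shutil

def has_free_space(path, required_bytes):
    """Return True if the filesystem holding `path` has at least
    `required_bytes` free -- a rough pre-flight check before a job
    tries to create a hashed file in that directory."""
    usage = shutil.disk_usage(path)
    return usage.free >= required_bytes

# Example: require ~1 GB free before the job writes its hashed file.
# In the post's setup the directory would be /opt/etlbatch/hashfiles.
print(has_free_space("/tmp", 1 * 1024**3))
```

If this returns False, freeing space (as the poster did) or relocating the hashed file is the first thing to try before digging into the stage settings.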
- Tue Jan 24, 2006 4:31 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Wierd problem in transformer
- Replies: 11
- Views: 3661
Hi Since it involves production, that is the reason I am going for an urgent fix; anyhow, I have now convinced the user and am going for testing. The first thing I am trying is, instead of restricting to 1M rows, restricting at 15M. The second thing is removing the transformer and just connecting the Oracle sta...
- Tue Jan 24, 2006 3:20 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Wierd problem in transformer
- Replies: 11
- Views: 3661
Hi This is to check whether there is any problem as such in the job. If this completes, then we are planning to split the file generation: get the max invoice, divide it by a constant value, and give that in the WHERE clause so that we can generate multiple files simultaneously and...
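The splitting approach described (take the max invoice, divide by a constant, feed the resulting ranges into WHERE clauses) can be sketched as follows. This is an illustrative Python helper, not the poster's actual code; the column name invoice_no and the choice of 4 partitions are placeholders:

```python
def invoice_ranges(max_invoice, n_parts):
    """Split 1..max_invoice into n_parts contiguous (low, high) ranges,
    one per parallel extract job."""
    chunk = -(-max_invoice // n_parts)  # ceiling division
    ranges = []
    low = 1
    while low <= max_invoice:
        high = min(low + chunk - 1, max_invoice)
        ranges.append((low, high))
        low = high + 1
    return ranges

# Row count taken from the original post; column name is hypothetical.
for lo, hi in invoice_ranges(16316653, 4):
    print(f"WHERE invoice_no BETWEEN {lo} AND {hi}")
```

Each range then drives one job instance, so the files are generated simultaneously instead of in one 16M-row pass.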
- Mon Jan 23, 2006 12:30 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Wierd problem in transformer
- Replies: 11
- Views: 3661
Hi I am running the job with a restriction on the number of rows selected (rownum < 1000001) and the job has completed. Magesh S Did the job have the same error using a 1=2 constraint or writing to /dev/null? If the error went away, then the source of the problem has to do with the sequential fil...
- Mon Jan 23, 2006 10:51 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Wierd problem in transformer
- Replies: 11
- Views: 3661
- Mon Jan 23, 2006 10:36 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Wierd problem in transformer
- Replies: 11
- Views: 3661
Re: Wierd problem in transformer
Hi, this is the ulimit -a output:
Executed command: "ulimit -a"
*** Output from command was: ***
time(seconds)        unlimited
file(blocks)         unlimited
data(kbytes)         unlimited
stack(kbytes)        8192
coredump(blocks)     unlimited
nofiles(descriptors) 1024
memory(kbytes)       unlimited
Magesh S
Hi I have reset the job, a...
- Mon Jan 23, 2006 10:22 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Wierd problem in transformer
- Replies: 11
- Views: 3661
Re: Wierd problem in transformer
Hi, I have reset the job, and this is what it showed:
From previous run
DataStage Job 161 Phantom 20532
Abnormal termination of UniVerse.
Fault type is 11. Layer type is BASIC run machine.
Fault occurred in BASIC program JOB.1227847453.DT.1376034616.TRANS1 at address e2.
kgefec: fatal error 0
We hav...
- Mon Jan 23, 2006 10:01 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Wierd problem in transformer
- Replies: 11
- Views: 3661
Wierd problem in transformer
Hi I have a control job, which in turn calls a child job. The child job contains an Oracle stage, a Transformer stage, and a Sequential File stage. The Oracle stage selects around 16316653 rows and writes them to the sequential file through the transformer. But after writing 15564747 rows, the job go...
- Thu Dec 29, 2005 9:37 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: ds_seqput: error in 'write()' - Error 0
- Replies: 5
- Views: 1185
- Fri Dec 23, 2005 12:47 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: ds_seqput: error in 'write()' - Error 0
- Replies: 5
- Views: 1185
Hi The error occurs when the file size reaches 2GB. I have checked the directory: it has enough free space (around 5GB), and the job has been running successfully for months. I have also checked the file sizes generated in previous runs, which are well below 2GB. So this may be a restriction imposed b...
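A write() failing exactly at the 2GB mark usually points at a file-size ceiling: either the process's RLIMIT_FSIZE limit (what ulimit -f reports) or a build without large-file support. A minimal sketch to inspect the process limit from Python (Unix-only; this uses the standard resource module and is not DataStage-specific):

```python
import resource

def max_file_size_limit():
    """Return the soft limit on the size of files this process may
    create (RLIMIT_FSIZE). A ceiling near 2**31 bytes here would
    explain a write() error right as a file reaches 2 GB."""
    soft, hard = resource.getrlimit(resource.RLIMIT_FSIZE)
    return soft  # resource.RLIM_INFINITY means "unlimited"

limit = max_file_size_limit()
print("unlimited" if limit == resource.RLIM_INFINITY else limit)
```

If the limit is unlimited (as the poster's ulimit -a output suggests), the ceiling more likely comes from the application or filesystem lacking large-file (64-bit offset) support.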
- Fri Dec 23, 2005 10:15 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: ds_seqput: error in 'write()' - Error 0
- Replies: 5
- Views: 1185
ds_seqput: error in 'write()' - Error 0
Hi I am getting the following error: ds_seqput: error in 'write()' - Error 0. I have checked the forum; the explanations given were for DS versions 6 and 7. Are there any restrictions on .dat file generation? In my job I am selecting data from a table and generating a .dat file. I have even checked using ulim...
- Fri Dec 16, 2005 6:37 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: URL parsing in Datastage
- Replies: 9
- Views: 3523