Search found 68 matches

by maheshsada
Tue Mar 28, 2006 5:43 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: UVOpen mkdbfile: cannot create file
Replies: 8
Views: 2365

Hi

I am posting the error I got in the Director log; there is no log entry saying the delete failed. Moreover, in the job design for the hashed file stage, under the update action we have selected

"Clear file before writing", and

the "Create File" option is unchecked.

Magesh S
by maheshsada
Tue Mar 28, 2006 2:14 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: UVOpen mkdbfile: cannot create file
Replies: 8
Views: 2365

No, it is not giving any other message. The day before, the run failed because of a space issue on the server, so we deleted some unwanted files and created space. Then we restarted the job and it failed with the error below. We reset the job and the error message is DSD.UVOpen Unable to o...
by maheshsada
Mon Mar 27, 2006 11:30 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: UVOpen mkdbfile: cannot create file
Replies: 8
Views: 2365

UVOpen mkdbfile: cannot create file

Hi. In my job I am selecting data from a table (1077070) and writing it to a hashed file, and this hashed file is used as a lookup for another job. This job has been running fine for more than a year; suddenly it is giving an error DSD.UVOpen mkdbfile: cannot create file /opt/etlbatch/hashfiles/HAdcCharge_026. I have checked the s...
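One quick check when a hashed file path starts throwing open/create errors is to try opening it by its full OS path from a small server routine. The sketch below is only illustrative and is not from the original job; the routine name, the file variable, and the decision to merely log a warning are assumptions, while OPENPATH, DSLogWarn and Close are the standard UniVerse/DataStage BASIC calls.

    * Hypothetical diagnostic routine: can the hashed file named in the error be opened?
    Ans = 0
    HashPath = "/opt/etlbatch/hashfiles/HAdcCharge_026"
    OPENPATH HashPath TO F.Hash ELSE
       Call DSLogWarn("Cannot open hashed file ":HashPath, "CheckHashedFile")
       Ans = -1
    End
    If Ans = 0 Then Close F.Hash

If the open fails even though the directory has space, that usually points at permissions or leftover pieces of the old file rather than the job design itself.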
by maheshsada
Tue Jan 24, 2006 4:31 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Weird problem in transformer
Replies: 11
Views: 3661

Hi. Since it involves production, that is the reason I went for an urgent fix; anyhow, I have now convinced the user and we are going for testing. The first thing I am trying is, instead of restricting to 1M rows, restricting at 15M. The second thing is removing the transformer and just connecting the Oracle sta...
by maheshsada
Tue Jan 24, 2006 3:20 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Weird problem in transformer
Replies: 11
Views: 3661

Hi. This is to check whether there is any problem as such in the job. If this completes, then we are planning to split the file generation: get the max invoice, divide it by a constant value, and then use that in the WHERE clause, so that we can simultaneously generate multiple files and...
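A rough sketch of that splitting idea, written as DataStage BASIC job control, is below. The child job name, the InvoiceFrom/InvoiceTo parameters, the slice count and the hard-coded maximum invoice are all made up for illustration; only the DSAttachJob/DSSetParam/DSRunJob/DSWaitForJob calls are the standard job-control API.

    $INCLUDE DSINCLUDE JOBCONTROL.H
    * Hypothetical job-control sketch: run the extract child job once per invoice range.
    MaxInvoice = 16000000            ;* would really come from a SELECT MAX(invoice) step
    NumSlices  = 4
    SliceSize  = Int(MaxInvoice / NumSlices) + 1

    For Slice = 1 To NumSlices
       LowVal  = (Slice - 1) * SliceSize + 1
       HighVal = Slice * SliceSize

       hJob = DSAttachJob("ChildJob", DSJ.ERRFATAL)
       ErrCode = DSSetParam(hJob, "InvoiceFrom", LowVal)
       ErrCode = DSSetParam(hJob, "InvoiceTo", HighVal)
       ErrCode = DSRunJob(hJob, DSJ.RUNNORMAL)
       ErrCode = DSWaitForJob(hJob)
       ErrCode = DSDetachJob(hJob)
    Next Slice

As written this runs the slices one after another; generating the files truly in parallel would need a multi-instance child job started once per slice before waiting on any of them.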
by maheshsada
Mon Jan 23, 2006 12:30 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Weird problem in transformer
Replies: 11
Views: 3661

Hi. I am running the job with a restriction on the number of rows selected (rownum < 1000001) and the job has completed. Magesh S Did the job have the same error using a 1=2 constraint or writing to /dev/null? If the error went away, then the source of the problem has to do with the sequential fil...
by maheshsada
Mon Jan 23, 2006 10:51 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Weird problem in transformer
Replies: 11
Views: 3661

Hi Andrw. It is all a straight drag-through of column values, but one or two columns are assigned null values using @NULL, and in one column the data format is derived through a routine. The logic in the routine is: *DD/MM/YYYY HH24:MI hour = Arg1[12,2] If hour < 12 Then Ans = oconv(Iconv(Arg1[1,10], "...
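For context, a routine of the kind described, taking an argument like DD/MM/YYYY HH24:MI, reconverting the date part with Iconv/Oconv and branching on whether the hour is before 12, might look roughly like the sketch below. The original post is truncated, so the conversion codes, the output format and the AM/PM handling are guesses, not the poster's actual routine.

    * Hypothetical reconstruction for illustration only.
    * Arg1 is assumed to arrive as "DD/MM/YYYY HH24:MI".
    Hour = Arg1[12,2]                             ;* characters 12-13 hold the hour
    DateInt = Iconv(Arg1[1,10], "D/DMY[2,2,4]")   ;* internal date from DD/MM/YYYY
    If Hour < 12 Then
       Ans = Oconv(DateInt, "D-YMD[4,2,2]") : " AM"
    End Else
       Ans = Oconv(DateInt, "D-YMD[4,2,2]") : " PM"
    End

If a routine like this were crashing inside a transformer, a null or short Arg1 (so that Arg1[12,2] and the Iconv get unexpected input) would be the first thing to rule out.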
by maheshsada
Mon Jan 23, 2006 10:36 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Weird problem in transformer
Replies: 11
Views: 3661

Re: Weird problem in transformer

Hi, this is the ulimit -a output. Executed command: "ulimit -a" *** Output from command was: *** time(seconds) unlimited file(blocks) unlimited data(kbytes) unlimited stack(kbytes) 8192 coredump(blocks) unlimited nofiles(descriptors) 1024 memory(kbytes) unlimited Magesh S Hi I have reset the job, a...
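The "Executed command / Output from command was" lines are the format logged when a shell command is run from a job (for example via the ExecSH before-job subroutine). An equivalent check from job-control BASIC could be done with DSExecute, roughly as in the sketch below; the routine name passed to the log call is made up, while DSExecute and DSLogInfo are the standard calls.

    * Run a shell command from DataStage BASIC and write its output to the job log.
    Command = "ulimit -a"
    Call DSExecute("UNIX", Command, Output, SysRet)
    Call DSLogInfo("Output from ":Command:" was: ":Output, "CheckLimits")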
by maheshsada
Mon Jan 23, 2006 10:22 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Weird problem in transformer
Replies: 11
Views: 3661

Re: Weird problem in transformer

Hi, I have reset the job, and this is what it has shown. From previous run: DataStage Job 161 Phantom 20532 Abnormal termination of UniVerse. Fault type is 11. Layer type is BASIC run machine. Fault occurred in BASIC program JOB.1227847453.DT.1376034616.TRANS1 at address e2. kgefec: fatal error 0. We hav...
by maheshsada
Mon Jan 23, 2006 10:01 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Weird problem in transformer
Replies: 11
Views: 3661

Weird problem in transformer

Hi. I have a control job, which in turn calls a child job. In the child job there is an Oracle stage, a transformer stage and a sequential file stage. The Oracle stage selects around 16316653 rows and then writes to the sequential file through the transformer. But after writing 15564747 rows, the job go...
by maheshsada
Thu Dec 29, 2005 9:37 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: ds_seqput: error in 'write()' - Error 0
Replies: 5
Views: 1185

Hi

Thank you all. The error was due to the file size limit in the OS; once the Unix admin changed the setting, the job created a file of more than 2 GB.

regards

Magesh S
by maheshsada
Fri Dec 23, 2005 12:47 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: ds_seqput: error in 'write()' - Error 0
Replies: 5
Views: 1185

Hi. The error comes when the file size reaches 2 GB. I have checked the directory; it has enough free space (around 5 GB), and the job has been running successfully for months. I have also checked the file size generated in previous runs, which is well below 2 GB. So this may be the restriction imposed b...
by maheshsada
Fri Dec 23, 2005 10:15 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: ds_seqput: error in 'write()' - Error 0
Replies: 5
Views: 1185

ds_seqput: error in 'write()' - Error 0

Hi. I am getting the following error: ds_seqput: error in 'write()' - Error 0. I have checked the forum; the explanations there were given for DS versions 6 and 7. Are there any restrictions on the .dat file generation? In my job I am selecting data from a table and generating a .dat file. I have even checked using ulim...
by maheshsada
Fri Dec 16, 2005 6:37 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: URL parsing in DataStage
Replies: 9
Views: 3523

Hi

I have checked the Count function individually and it gives valid values (i.e. the count of the "/"). When I put the Count inside the Field function, it gives the phantom error.

Any updates?

regards
Magesh S
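For reference, the kind of expression being described, a Count of "/" fed into Field, can be tried in a small test routine like the sketch below. The URL value, the variable names and the choice of pulling the last segment are just examples; Count, Dcount and Field are the standard BASIC functions.

    * Illustrative only: count the "/" delimiters in a URL and use that count with Field.
    Url = "http://www.example.com/catalog/items/1234"

    Slashes  = Count(Url, "/")            ;* number of "/" characters (5 here)
    Segments = Dcount(Url, "/")           ;* number of "/"-delimited fields (6 here)

    * Field(string, delimiter, occurrence) returns one delimited piece;
    * delimiter count + 1 picks the final segment.
    LastSegment = Field(Url, "/", Slashes + 1)

    Ans = LastSegment                     ;* "1234" for the value above

If Count works on its own but the nested Count-inside-Field expression aborts with a phantom, assigning the count to a variable first, as above, at least narrows down which call is failing.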