Problem with VOC after database Migration.

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

Anupam_M
Premium Member
Posts: 13
Joined: Tue Aug 01, 2006 5:18 am
Location: London

Problem with VOC after database Migration.

Post by Anupam_M »

Hi DSGurus,
I am facing an unusual problem. My source and target databases have been migrated from Oracle 9i to Oracle 10g. The DataStage jobs read data from and write data to dynamic hashed files, and these files are deleted and recreated every time the jobs run. Since the migration, when I run the jobs that create these hashed files I get the following error:
"<Filename>" is already in your VOC file as a file definition record.
As far as I understand, this means the synonym already exists in the VOC, so the file cannot be recreated. Prior to the migration this was not a problem.
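For illustration only: the delete-and-recreate cycle is roughly what the following TCL commands would do by hand for one file (MYFILE is a made-up name, and the jobs do this internally rather than at TCL):

Code:

DELETE.FILE MYFILE
CREATE.FILE MYFILE DYNAMIC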
I have tried running the jobs with different hashed file names and they work fine.
However, given the huge number of hashed files being created, I do not think that is a feasible solution.
Could someone help with this?
Regards,
Anupam Mukherjee.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Welcome Aboard. :D

How did you accomplish this 'database migration'? Specifically, what changes did you make to your DataStage environment? I ask because a database migration and this error should in no way be related. :?
-craig

"You can never have too many knives" -- Logan Nine Fingers
Anupam_M
Premium Member
Posts: 13
Joined: Tue Aug 01, 2006 5:18 am
Location: London

Post by Anupam_M »

Hi Craig
Thanks for your reply. No specific changes were made to the DataStage environment. I am not very sure how the database migration itself was carried out.
However, there are a few more observations:
1. The hashed files are deleted and recreated every time the job is run.
So possibly the job is deleting the hashed file but not the vocabulary (VOC) entry, and hence the error when it tries to recreate the hashed file. (Just a guess; a quick way to check is sketched below.)
2. After the job aborts I check the hashed file, and it is still there and looks quite perfect!
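One quick way to see whether a stale VOC record is what gets left behind (MYFILE is a placeholder for one of the hashed file names) is to display it from the Administrator command line:

Code:

CT VOC MYFILE

If a record comes back, its first field should be F and the next two fields the data and dictionary pathnames.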
Your further feedback/suggestions will be of great help. :D
Regards,
Anupam Mukherjee.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Hmmm... ok. Don't worry about how the database migration was done, that's a DBA thing and they like to keep that kind of arcane knowledge secret. :wink: Just wanted to see if you did something specific with DataStage as well as part of that. Shouldn't have needed to.

Are there any other messages in the job's log when you have this problem? Anything about a T30FILE setting, perhaps? I've seen this happen when certain uvconfig parameters need to be bumped up from their 'out of the box' settings, but they are generally accompanied by specific messages to that effect.

When the job aborts, have you tried Resetting it from the Director? If so, does it add a new log entry entitled 'From previous run...'?
-craig

"You can never have too many knives" -- Logan Nine Fingers
Anupam_M
Premium Member
Posts: 13
Joined: Tue Aug 01, 2006 5:18 am
Location: London

Post by Anupam_M »

Hi Craig,
This was the full error message that I got:
DSD.UVOpen rm: Unable to remove directory <Filename>: File exists
Unable to DELETE file "<Filename>".
"LocalRefs" is already in your VOC file as a file definition record.
File name =
File not created.

There was no mention of a T30FILE setting.
I had already tried resetting the job and running it again, but it does not add a "From previous run..." entry.
Regards,
Anupam Mukherjee.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

No, this is just DELETE.FILE failing because of entries pre-existing in the VOC file. The only safe course is to create a list of the failures, delete the pertinent entries from the file system and from the VOC. For example, if a hashed file is called MYFILE, then you need three steps, and you can ignore any "not found" errors that may occur.
Possibly the easiest approach is to create the three commands as follows in the Administrator, then select all three and save them as a pre-stored paragraph called, say, CLEANUP.

Code:

SH -c "rm -rf <<F(VOC,<<I2,HashedFile>>,3),Dict pathname>>"
SH -c "rm -rf <<F(VOC,<<I2,HashedFile>>,2),Data pathname>>"
DELETE VOC "<<I2,HashedFile>>"
For those on Windows servers, replace "SH -c" with "DOS /C" and "rm -rf" with "DEL /S/Q".
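As a worked example of what the paragraph does (the file name and paths here are made up): if the VOC record for MYFILE holds /data/hash/MYFILE in field 2 and /data/hash/D_MYFILE in field 3, then running CLEANUP and answering MYFILE at the prompt effectively executes:

Code:

SH -c "rm -rf /data/hash/D_MYFILE"
SH -c "rm -rf /data/hash/MYFILE"
DELETE VOC "MYFILE"

The F() references simply pull those two pathname fields out of the VOC record at run time.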
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

ray.wurlod wrote:No, this is just DELETE.FILE failing because of entries pre-existing in the VOC file.
Granted, but I'm trying to get to the root cause. Before that, something caused the delete of the VOC record to fail, causing this error on subsequent runs... yes? :?
-craig

"You can never have too many knives" -- Logan Nine Fingers
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

The root cause is that DELETE.FILE does not delete the physical objects if they are referred to by a pathname rather than a simple entry name. This behaviour is documented, both in the UniVerse manuals and by typing HELP DELETE.FILE at the TCL prompt in a telnet session (HELP is blocked from the Administrator client).

So DELETE.FILE deletes the VOC entry but not the data and dictionary portions of the hashed file. My paragraph, posted earlier, deletes all three.

It does, however, require that the VOC pointer exists, so you may wish to include or execute an appropriate SETFILE command beforehand.
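For example (the pathname and entry name here are made up), something like the following re-establishes the pointer so that the F() references in the paragraph have a VOC record to read:

Code:

SETFILE /data/hash/MYFILE MYFILE OVERWRITING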
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.