help: before-job subroutine dealing with hash file

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

idocrm
Participant
Posts: 17
Joined: Wed Jul 23, 2003 9:41 pm

help: before-job subroutine dealing with hash file

Post by idocrm »

Gurus,
Can you please tell me what functions are available to open a hash file (by path) and set a column value for a particular key? (I think I can do this by developing a job, but I want to see if I can do it with a subroutine.)
What I have is a hash file shared by several jobs. Since they run on different schedules, I can't check the 'Clear file before writing' option under 'Update action'.
Say my hash file columns look like the following:
key|count
and I have these entries
"one"|123
"two"|456
I want to write a subroutine that accepts a 'key' parameter and the hash file name (with path) so that I can set its count back to an initial value.
My subroutine would be something like InitHashPerKey(hashFileName, key).
Any ideas are appreciated.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Use OpenPath statement to open the hashed file, ReadU or RecordLockU to lock the record for update, and Write to update (overwrite) the record.
Find all these statements in on-line help or in the BASIC manual.
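
A minimal sketch of such a routine, assuming the before/after-job subroutine convention of a single InputArg string carrying "path,key", that the count is field 1 of the hashed file record, and that the initial value is 0 (the argument layout, field position and initial value are assumptions, not from the original post):

Subroutine InitHashPerKey(InputArg, ErrorCode)
* Sketch only. InputArg is assumed to carry "<path>,<key>".
ErrorCode = 0
HashFilePath = Field(InputArg, ",", 1)
KeyValue = Field(InputArg, ",", 2)
OpenPath HashFilePath To HashFile Then
   ReadU Rec From HashFile, KeyValue Else Rec = ""   ;* lock record, create it if missing
   Rec<1> = 0                                        ;* count assumed to be field 1
   Write Rec To HashFile, KeyValue                   ;* Write releases the update lock
   Close HashFile
End Else
   Call DSLogWarn("Cannot open hashed file " : HashFilePath, "InitHashPerKey")
   ErrorCode = 1
End
Return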

Alternatively, generate an SQL statement and execute it via the DSExecute() function.
Shell = "UV"
Command = "INSERT INTO table (@ID,F1) VALUES ('one','123');"
Output = ""
Code = 0
Call DSExecute(Shell, Command, Output, Code)
If Code 0
Then
Call DSLogWarn("Error in hashed write.","MyRoutine")
End
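
One point to be aware of with the SQL route: UV SQL can only reference the hashed file by a name known to the project account, so a pathed hashed file needs a VOC pointer first (for example one created with SETFILE), and to change the count for a key that already exists you would issue an UPDATE statement rather than an INSERT.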

Edited by - ray.wurlod on 01/02/2003 15:41:44