How to Clear a Hashed File Outside of a DataStage Job?

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

kduke
Charter Member
Posts: 5227
Joined: Thu May 29, 2003 9:47 am
Location: Dallas, TX

Post by kduke »

If the last 2 lines are not in the log, then you are failing hard, meaning you are probably core dumping. Your hashed file may be corrupt, your filepath may be bad, or you may need to quote it properly. If there is a space in the filepath, then you could be clearing a directory or something other than what you really want to clear.
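The danger of an unquoted space can be demonstrated with plain shell word splitting (a generic sketch, not DataStage-specific; the path is made up for illustration):

```shell
# An unquoted path containing spaces is split into several arguments,
# so a command like clear.file could receive the wrong target.
path="/mount/test/sa/My Hash File"

set -- $path        # unquoted: word-split into three arguments
echo "$#"           # prints 3

set -- "$path"      # quoted: passed through as a single argument
echo "$#"           # prints 1
```

This is exactly why the filepath must be wrapped in quotes before it reaches the command.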
Mamu Kim
leomauer
Premium Member
Posts: 100
Joined: Mon Nov 03, 2003 1:33 pm

Post by leomauer »

chulett wrote:So, you have no idea what the error message is? :? Have you tried Resetting the job via the Director? Sometimes that will get you some additional information labelled as "From previous run".
No I have not. :oops:

I did it now.
Still not getting much:

From previous run

Code: Select all

DataStage Job 2741 Phantom 17305
Attempting to Cleanup after ABORT raised in stage ClearHashFile..JobControl

DataStage Phantom Aborting with @ABORT.CODE = 3
leomauer
Premium Member
Posts: 100
Joined: Mon Nov 03, 2003 1:33 pm

Post by leomauer »

kduke wrote:If the last 2 lines are not in the log, then you are failing hard, meaning you are probably core dumping. Your hashed file may be corrupt, your filepath may be bad, or you may need to quote it properly. If there is a space in the filepath, then you could be clearing a directory or something other than what you really want to clear.
Core dumping sounds likely.
And yet an ls of this file through the same code does not fail.
And if I were pointing to a non-existent file, shouldn't I be getting "file not found" or something like that?
kduke
Charter Member
Posts: 5227
Joined: Thu May 29, 2003 9:47 am
Location: Dallas, TX

Post by kduke »

Somehow it is doing something that it cannot recover from, so you are not getting good error messages. Run the same command at the UNIX prompt, but be in the project directory, and see what happens.
Mamu Kim
leomauer
Premium Member
Posts: 100
Joined: Mon Nov 03, 2003 1:33 pm

Post by leomauer »

kduke wrote:Somehow it is doing something that it cannot recover from, so you are not getting good error messages. Run the same command at the UNIX prompt, but be in the project directory, and see what happens.
In my previous posting I showed the result of executing it at the UNIX prompt:
Code: Select all

$ whence clear.file
/dsadm/Ascential/DataStage/DSEngine/bin/clear.file
$ /dsadm/Ascential/DataStage/DSEngine/bin/clear.file /mount/test/sa/hash/CST010_hash
Unable to open VOC file.
*** Processing cannot continue. ***
$ print $?
1
Does it mean that if the file is not in the VOC I can't clear it using this feature?
And yet it still gives me valid output and a return code. I expected to see them in the log even if the execution was not successful.
But as I said, it aborts before getting to the log entries.
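The behaviour leomauer expected, getting both the output and the return code even from a failed run, is easy to get from a shell wrapper (a generic sketch; a failing `sh -c` command stands in here for the real clear.file invocation, which is not assumed to be available):

```shell
# Capture stdout+stderr and the exit status of a command that fails
# the way clear.file does when the file has no VOC entry.
# The sh -c command below is a stand-in for the real clear.file call.
out=$(sh -c 'echo "Unable to open VOC file."; exit 1' 2>&1)
rc=$?

echo "rc=$rc"       # prints rc=1
echo "$out"         # prints Unable to open VOC file.
```

A job-control routine that captures output this way can then write both values to the job log before deciding whether to abort.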
kduke
Charter Member
Posts: 5227
Joined: Thu May 29, 2003 9:47 am
Location: Dallas, TX

Post by kduke »

Are you in the project directory?
Mamu Kim
leomauer
Premium Member
Posts: 100
Joined: Mon Nov 03, 2003 1:33 pm

Post by leomauer »

kduke wrote:Are you in the project directory?
No. I am in my own directory on UNIX.
But I would like to be able to execute it in the DataStage Director using job control, as was suggested before, not at the prompt.
kduke
Charter Member
Posts: 5227
Joined: Thu May 29, 2003 9:47 am
Location: Dallas, TX

Post by kduke »

When you attach to a job, DataStage makes your current directory the project's directory. Ray was saying that this command will only work in the project directory. I have never used this command in this way, so I would trust Ray.
Mamu Kim
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

I'm stumped on this one. :? I've created my own Before/After subroutine to do this, put it into a Server Job and all I've accomplished so far is to duplicate your situation - 'normal' O/S commands work fine, but executing clear.file craters the job. Server side tracing did not shed any light on what's going on for me, so I think I'll just wait for Ray to come back and straighten us all out.

:lol:
-craig

"You can never have too many knives" -- Logan Nine Fingers
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL

Post by kcbland »

Wow, nobody likes my earlier suggestion. You guys are having too much fun trying to make clear.file work from the UNIX command line. I don't think you can do this!!!

My earlier suggestion will work.
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
kduke
Charter Member
Posts: 5227
Joined: Thu May 29, 2003 9:47 am
Location: Dallas, TX

Post by kduke »

Craig

Good idea. Try running a normal execute.

Code: Select all

* Run clear.file via the shell, capturing whatever it writes in Output.
* The path is double-quoted inside the command in case it contains spaces.
Cmd = 'SH -c "clear.file ':filepath:'"'
execute Cmd capturing output
print output
Mamu Kim
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

kcbland wrote:My earlier suggestion will work.
I know, I already do this. :wink: Now just trying to make this other new fangled method work.

I've done something else that looks a little goofy but also works. Drop a Transformer on the canvas and hook it to the Hash Lookup so that it thinks you want to write to it. Create a stage variable so the compiler won't complain and then set a constraint that allows no rows to pass through - something like @FALSE works fine. This gives you the ability to check the 'Clear' checkbox or delete and recreate the hash, if desired.
-craig

"You can never have too many knives" -- Logan Nine Fingers
leomauer
Premium Member
Posts: 100
Joined: Mon Nov 03, 2003 1:33 pm

Post by leomauer »

This is how I did it:
We have a script that:
1. Checks if the hash file is in the VOC.
2. If not, creates the VOC entry on the fly.
3. Then goes into UV and runs SELECT COUNT(*) FROM filename.
4. Then, if the VOC entry was added in step 2, deletes it.
5. Exits UV and the script.

I copied this script and replaced the SELECT in step 3 with CLEAR.FILE filename.
It works, but I am still interested in the clear.file utility.
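For reference, the temporary-VOC-entry approach described in those steps might look roughly like the following (a hypothetical sketch, not leomauer's actual script: it assumes the engine's uv shell is reachable and uses the UniVerse SETFILE, CLEAR.FILE and DELETE commands; the project and file paths are illustrative):

```shell
# Must run from a project/account directory so uv can find the VOC.
cd /dsadm/Ascential/DataStage/Projects/MyProject || exit 1
uv <<'EOF'
SETFILE /mount/test/sa/hash/CST010_hash CST010_hash OVERWRITING
CLEAR.FILE CST010_hash
DELETE VOC CST010_hash
EOF
```

SETFILE creates the VOC pointer, CLEAR.FILE empties the hashed file through it, and DELETE VOC removes the temporary entry again.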
Leo
ketfos
Participant
Posts: 562
Joined: Mon May 03, 2004 8:58 pm
Location: san francisco

Post by ketfos »

Hi,
clear.file works from the UNIX command line, as shown below.

The hash file test1 is created using account name.

Code: Select all

/ardent/uv/DataStage/Projects/testproj $ clear.file test1
File "test1" has been cleared.
/ardent/uv/DataStage/Projects/testproj $
ketfos
Participant
Posts: 562
Joined: Mon May 03, 2004 8:58 pm
Location: san francisco

Post by ketfos »

Hi
The clear.file utility also works inside a job:
as an After Job subroutine, using ExecTCL with input value clear.file test1
as a Before Job subroutine, using ExecTCL with input value clear.file test1


Ketfos