
Posted: Thu Sep 23, 2004 9:20 am
by kduke
If the last 2 lines are not in the log, then you are failing hard, meaning you are probably core dumping. Your hash file may be corrupt, your filepath may be bad, or you may need to quote it properly. If there is a space in the filepath, you could be clearing a directory or something other than what you really want to clear.
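For illustration, a minimal sketch of quoting the path inside the command string, assuming filepath holds the full path to the hash file (the variable name and the surrounding job control context are assumptions):

Code: Select all

* Sketch only: wrap the path in its own double quotes so an embedded
* space is not split into separate shell arguments.
DQ = '"'
Cmd = "SH -c 'clear.file " : DQ : filepath : DQ : "'"
Execute Cmd Capturing Output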

Posted: Thu Sep 23, 2004 9:21 am
by leomauer
chulett wrote:So, you have no idea what the error message is? :? Have you tried Resetting the job via the Director? Sometimes that will get you some additional information labelled as "From previous run".
No, I have not. :oops:

I did it now.
Still not getting much:

From previous run
DataStage Job 2741 Phantom 17305
Attempting to Cleanup after ABORT raised in stage ClearHashFile..JobControl

DataStage Phantom Aborting with @ABORT.CODE = 3

Posted: Thu Sep 23, 2004 9:29 am
by leomauer
kduke wrote:If the last 2 lines are not in the log, then you are failing hard, meaning you are probably core dumping. Your hash file may be corrupt, your filepath may be bad, or you may need to quote it properly. If there is a space in the filepath, you could be clearing a directory or something other than what you really want to clear.
Core dumping sounds likely.
But an ls of this file through the execution of the same code does not fail.
And if I were pointing to a non-existent file, shouldn't I be getting "file not found" or something like that?

Posted: Thu Sep 23, 2004 11:38 am
by kduke
Somehow it is doing something that it cannot recover from, so you are not getting good error messages. Run the same command at the UNIX prompt, but be in the project directory. See what happens.

Posted: Thu Sep 23, 2004 11:52 am
by leomauer
kduke wrote:Somehow it is doing something that it cannot recover from, so you are not getting good error messages. Run the same command at the UNIX prompt, but be in the project directory. See what happens.
In my previous posting I showed the result of the UNIX execution at the prompt.
$ whence clear.file
/dsadm/Ascential/DataStage/DSEngine/bin/clear.file
$ /dsadm/Ascential/DataStage/DSEngine/bin/clear.file /mount/test/sa/hash/CST010_hash
Unable to open VOC file.
*** Processing cannot continue. ***
$ print $?
1
Does it mean that if the file is not in the VOC I can't clear it using this feature?
And yet it still gives me valid output and a return code. I expected to see them in the log even though the execution may not be successful.
But as I said, it aborts before getting to the log entries.

Posted: Thu Sep 23, 2004 12:23 pm
by kduke
Are you in the project directory?

Posted: Thu Sep 23, 2004 12:35 pm
by leomauer
kduke wrote:Are you in the project directory?
No. I am in my own directory on UNIX.
But I would like to be able to execute it in the DS Director using job control, as was suggested before, not at the prompt.

Posted: Thu Sep 23, 2004 1:00 pm
by kduke
When you attach to a job, DataStage makes your current directory the project's directory. Ray was saying that this command will only work in the project directory. I have never used this command in this way, so I would trust Ray.
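As a quick sanity check, the working directory that job control actually sees can be written to the job log; a minimal sketch (the variable names and the 'CheckCwd' log tag are assumptions):

Code: Select all

* Sketch only: run pwd through the SH shell and log the result.
Call DSExecute('SH', 'pwd', Output, RtnCode)
Call DSLogInfo('Job control current directory: ':Output, 'CheckCwd')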

Posted: Thu Sep 23, 2004 1:58 pm
by chulett
I'm stumped on this one. :? I've created my own Before/After subroutine to do this, put it into a Server Job and all I've accomplished so far is to duplicate your situation - 'normal' O/S commands work fine, but executing clear.file craters the job. Server side tracing did not shed any light on what's going on for me, so I think I'll just wait for Ray to come back and straighten us all out.

:lol:

Posted: Thu Sep 23, 2004 2:27 pm
by kcbland
Wow, nobody likes my earlier suggestion. You guys are having too much fun trying to make clear.file work from a UNIX command. I don't think you can do this!!!

My earlier suggestion will work.

Posted: Thu Sep 23, 2004 2:30 pm
by kduke
Craig

Good idea. Try running a normal execute.

Code: Select all

* Build the shell command; the whole clear.file call is double-quoted
Cmd = 'SH -c "clear.file ':filepath:'"'
Execute Cmd Capturing Output
Print Output

Posted: Thu Sep 23, 2004 2:40 pm
by chulett
kcbland wrote:My earlier suggestion will work.
I know, I already do this. :wink: Now just trying to make this other newfangled method work.

I've done something else that looks a little goofy but also works. Drop a Transformer on the canvas and hook it to the Hash Lookup so that it thinks you want to write to it. Create a stage variable so the compiler won't complain and then set a constraint that allows no rows to pass through - something like @FALSE works fine. This gives you the ability to check the 'Clear' checkbox or delete and recreate the hash, if desired.

Posted: Thu Sep 23, 2004 2:49 pm
by leomauer
That's how I did it:
We have a script that:
1. Checks if the hash file is in the VOC.
2. If not, creates the VOC entry on the fly.
3. Then goes into UV and runs SELECT COUNT(*) FROM filename.
4. Then, if the VOC entry was added, deletes it.
5. Exits UV and the script.

I copied this script and replaced the SELECT in step 3 with CLEAR.FILE filename.
It works, but I am still interested in the clear.file utility.
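For illustration, a minimal sketch of that same sequence driven from DataStage BASIC job control instead of a UNIX script (DSExecute, the example path, and the VOC name are assumptions, not the actual script):

Code: Select all

* Sketch only: create a VOC pointer, clear the file, then delete the
* pointer. Assumes the code runs attached to the project account.
FilePath = '/mount/test/sa/hash/CST010_hash'
VocName = 'CST010_hash'
Call DSExecute('UV', 'SETFILE ':FilePath:' ':VocName:' OVERWRITING', Output, RtnCode)
If RtnCode = 0 Then
   Call DSExecute('UV', 'CLEAR.FILE ':VocName, Output, RtnCode)
   Call DSExecute('UV', 'DELETE VOC ':VocName, Output, RtnCode)
End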
Leo

Posted: Thu Sep 23, 2004 4:11 pm
by ketfos
Hi,
clear.file works from the UNIX command line, as shown below.

The hash file test1 was created in the account, so its name resolves through the VOC.

/ardent/uv/DataStage/Projects/testproj $ clear.file test1
File "test1" has been cleared.
/ardent/uv/DataStage/Projects/testproj $

Posted: Thu Sep 23, 2004 4:33 pm
by ketfos
Hi,
The clear.file utility also works inside the job:
as an After Job subroutine, using ExecTCL - clear.file test1
as a Before Job subroutine, using ExecTCL - clear.file test1


Ketfos