
Hash File Write failed for record id

Posted: Thu Jul 28, 2005 5:14 pm
by logic
Hi,
One job in my sequence simply reads a source table and writes to a hash file. While reading from the source table, only records for the past 100 days are selected by a where clause. All the jobs give the expected results, and even the number of records written to the hash file is as expected. Still, the job, and therefore the sequence, fails with the following error:

Code: Select all

CW_HR_delPS_CW_TA_CARD_HDR_Source..HSH_PS_CW_TA_CARD_HDR_Source.Lkp_PS_CW_TA_CARD_HDR: WriteHash() - Write failed for record id '59812
2003-11-15 00:00:00'
The where clause is

Code: Select all

END_DT BETWEEN SYSDATE - 100 AND SYSDATE
What I do not understand is why the record from 2003 would be selected when the where clause is as above, and why the write to the hash file would fail.
Any comments would be helpful, please.
Thanks,
Ash.

Posted: Thu Jul 28, 2005 5:23 pm
by pnchowdary
Hi,

What database are you extracting the records from?

Posted: Thu Jul 28, 2005 5:25 pm
by logic
Oracle, using the DRS stage.
...and interestingly, the job does not generate any warning if I run it standalone.
Thanks.

Posted: Thu Jul 28, 2005 5:32 pm
by pnchowdary
Hi,

Try running the same SQL directly in Oracle and see whether it still selects that 2003 record.
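
If you want to dig a little further, something along these lines might help. This is only a sketch: the table name PS_CW_TA_CARD_HDR is guessed from the hashed file name, so substitute your real source table. DUMP() shows the raw bytes Oracle has stored for END_DT, which can reveal a corrupt century byte or an unexpected time portion that a plain SELECT would hide.

Code: Select all

-- sketch only: table name assumed from the hashed file name
SELECT END_DT,
       TO_CHAR(END_DT, 'YYYY-MM-DD HH24:MI:SS') AS end_dt_text,
       DUMP(END_DT)                             AS end_dt_bytes
FROM   PS_CW_TA_CARD_HDR
WHERE  END_DT BETWEEN SYSDATE - 100 AND SYSDATE
ORDER  BY END_DT;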

Posted: Thu Jul 28, 2005 5:34 pm
by logic
No, it doesn't...

Posted: Thu Jul 28, 2005 5:39 pm
by logic
Hi,
When I executed the job once again, it didn't generate the error. Anyway, I am just going to remove the job from the sequence to be on the safe side. Since the number of records is small and I am only using this job to create a lookup for the next job, I will simply eliminate it; performance is not an issue here.
The question still remains: why did it select the 2003 record? :x
Thanks

Posted: Thu Jul 28, 2005 9:26 pm
by ray.wurlod
The failure to write to the hashed file is usually indicative of a permissions problem (in which case you can't write ANY record to the hashed file) or a corrupted group (page) within the hashed file.

Usually you delete and re-create the hashed file on each run, so the newly-created hashed file should be OK, but if you have a bad spot on the disk you might still see the problem, perhaps on a different page.

As to how the date has become 2003, you need to trace the Transformer stage (Tracing tab in Job Run Options) capturing in-and-out data, and/or to run in Debugger. Is there, for example, a transformation applied to the date before it becomes a key column in the hashed file?

You might also like to send the output to a text file which you can examine with a hex editor, for example to determine whether there is any non-printing character forming part of the data.
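
If you are on UNIX, a quick alternative to a hex editor is od. For example (the file name here is just a placeholder for wherever you land the extract):

Code: Select all

# dump the extract byte by byte; any control characters show up as
# backslash escapes or octal codes (file name is a placeholder)
od -c lookup_extract.txt | more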

Posted: Fri Jul 29, 2005 10:00 am
by logic
Hi,
My apologies for posting this topic in the wrong forum.
No, there is no transformation being applied to the date.
The date did not become 2003 during transformation; rather, it was selected from the source in spite of the where clause. On rerun the 2003 date was not selected, nor is it being selected when I run the job now, so I don't know how I can trace the input and output at the Transformer stage. Is there any way to trace what happened in the previous run?
I wrote the data to a text file and examined it... it looks fine.
Thanks

Posted: Fri Oct 06, 2006 8:12 am
by Triton46
I've got this error and it appears to be permissions related. I (the owner) can create/drop the hash file, but my subordinate (a user with group permissions) cannot. How do I rectify this?

Posted: Fri Oct 06, 2006 9:10 am
by Triton46
NewDDWHash:
total 42
drwxrwsr-x 2 myaccount dstage 96 Nov 3 2005 ./
drwxrwsr-x 152 dsadm dstage 5120 Oct 6 15:44 ../
-rw-r--r-- 1 myaccount dstage 0 Nov 3 2005 .Type30
-rw-r--r-- 1 myaccount dstage 12288 Oct 6 05:30 DATA.30
-rw-r--r-- 1 myaccount dstage 4096 Oct 6 05:29 OVER.30

I tried "chmod g+rw *Hash*" but the DATA.30, .Type30 and OVER.30 files are unaffected.

Posted: Fri Oct 06, 2006 9:15 am
by ArndW
Try "chmod -R 775 NewDDWHash" for a recursive change.
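
For what it's worth, the earlier "chmod g+rw *Hash*" only matched the NewDDWHash directory itself, not the files inside it, which is why DATA.30, OVER.30 and .Type30 were untouched. A recursive change (plus, optionally, a group-friendly umask so files from later runs stay writable; the umask value is just an assumption) would look like:

Code: Select all

# apply group read/write (and execute/search on the directory) to the
# hashed file directory and everything inside it
chmod -R 775 NewDDWHash

# optional (assumed): put this in the owning account's profile so files
# created on later runs are group-writable as well
umask 002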

Posted: Fri Oct 06, 2006 11:30 am
by Triton46
That worked. Thanks!