Error Writing to Hashed File

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

Post Reply
gsherry1
Charter Member
Posts: 173
Joined: Fri Jun 17, 2005 8:31 am
Location: Canada

Error Writing to Hashed File

Post by gsherry1 »

Hello Forum

I received the following error:
MyJobName.MYINVOCID.CntHistHash.CountHistory: ds_uvput() - Write failed for record id 'MYINVOCID
13952
SplitLoad
Routing
RoutingSource'
Some other information about the job containing this hashed file:

1. The file is not large (< 20 MB).
2. There are no special characters in the key fields.
3. There are no nulls being mapped to the key fields in the hashed file. The data is populated directly from a table and uses the same key as the table's primary key.
4. Write Caching is disabled.
5. The hashed file is used in a multi-instance job, but the hashed file name is based on the invocation id.

Any suggested reasons for this error?

Thanks,

Greg
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

Greg, you have special characters in your key field, most likely CHAR(254) (the @FM). Use an OCONV(YourColumn,'MCP') conversion to change non-displayable characters to a period and check your data again.
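
As a rough sketch of that check in a server routine (the argument name KeyValue and the routine name KeyCheck are only placeholders for illustration, not anything from your job):

      * Sketch of a check for hidden characters in a key value.
      * KeyValue is an assumed routine argument; Ans is the routine result.
      Visible = Oconv(KeyValue, 'MCP')   ;* non-printable bytes become '.'
      If Visible # KeyValue Then
         * At least one non-printable character is present,
         * e.g. CHAR(254) = @FM or CHAR(253) = @VM.
         Call DSLogWarn('Key contains hidden characters: ' : Visible, 'KeyCheck')
      End
      Ans = Visible

Anything that comes back as a period in the warning but is not a literal period in your source data is a character you need to track down.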
gsherry1
Charter Member
Posts: 173
Joined: Fri Jun 17, 2005 8:31 am
Location: Canada

Post by gsherry1 »

ArndW wrote: Greg, you have special characters in your key field, most likely CHAR(254) (the @FM). Use an OCONV(YourColumn,'MCP') conversion to change non-displayable characters to a period and check your data again. ...
ArndW,

Thanks for your response. The data unloaded from the table and loaded into the hashed file has run both before and since this error with no trouble. The content of the data in the table has not changed.

In particular, the one problematic record that caused the error:

Varchar(10):MYINVOCID
Date:13952
Varchar(100):SplitLoad
Varchar(50):Routing
Varchar(50):RoutingSource

The date of the data causing the error is 13952 (March 13, 2006). I have unloaded this data manually, placed it into a hex viewer, and checked for any suspicious values, and have found none.

So if special characters were the issue, they must have been introduced by DS for this one execution, and not by the source data itself.
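
For what it's worth, the same checks can also be repeated inside DataStage rather than in a hex viewer; a quick sketch (the names and the typed-in column value below are only illustrative, not taken from the job):

      * Sketch only: the column value below is typed in for illustration.
      * 1. Confirm the internal date: 13952 should display as 13 MAR 2006.
      DateText = Oconv(13952, 'D')
      * 2. Dump the decimal code of every byte in a suspect key value so a
      *    stray CHAR(254) or CHAR(0) shows up immediately.
      KeyColumn = 'RoutingSource'
      Codes = ''
      For I = 1 To Len(KeyColumn)
         Codes := Seq(KeyColumn[I, 1]) : ' '
      Next I
      Call DSLogInfo('Date: ' : DateText : ', codes: ' : Codes, 'KeyCheck')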

- Greg
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

Greg,

Your composite key of 5 columns could not be written to the hashed file. The key is written as a single text record id, so the data format of columns such as dates or times is irrelevant.

This error will happen on DataStage hashed files when:

(a) the disk is full, so new space needed for the write cannot be allocated from the system;
(b) the file has internal corruption;
(c) the key contains illegal values, specifically an @FM, a NULL, or an empty string;
(d) the key is longer than the maximum length configured.

So you have four basic causes to choose from. If you think the cause is DataStage inserting some spurious character on one run, then it is going to be impossible to solve in this forum, as it isn't reproducible. If you haven't seen this error before or since, the likely culprit is (a). It might have been (b) if your job deletes and re-creates the file each run, since that would have removed the evidence of a corrupted hashed file on the subsequent run. It seems that (d) is ruled out as well.
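
If you want to guard against (c) and (d) programmatically before the write, a minimal sketch in BASIC (the file name, variable names, and the 255-byte limit here are assumptions for illustration, not values from your job or its configuration) would be something like:

      * Sketch only: names and the 255-byte limit are illustrative assumptions.
      * RecordId and RecordData are assumed to arrive from the calling job.
      Open 'MyHashedFile' To FileVar Else
         Call DSLogFatal('Cannot open hashed file', 'KeyCheck')
      End
      BadKey = @FALSE
      If RecordId = '' Or IsNull(RecordId) Then BadKey = @TRUE   ;* cause (c)
      If Index(RecordId, @FM, 1) > 0 Then BadKey = @TRUE         ;* cause (c)
      If Len(RecordId) > 255 Then BadKey = @TRUE                 ;* cause (d)
      If BadKey Then
         Call DSLogWarn('Rejecting record id: ' : Oconv(RecordId, 'MCP'), 'KeyCheck')
      End Else
         Write RecordData On FileVar, RecordId Else
            Call DSLogWarn('Write failed for id: ' : Oconv(RecordId, 'MCP'), 'KeyCheck')
         End
      End

That won't tell you why this one run failed, but it will at least log the offending key in a readable form the next time it happens.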
Post Reply