Write failed on hash file
Posted: Tue Dec 14, 2004 3:35 pm
I have a server job that failed with a ds_uvput() write error:
    ds_uvput() - Write failed for record id 'COUNTRY
    JM'
When I re-ran the job, it completed without error, but some rows were missing from the hash file.
Other posts with this error seem to have been resolved as disk space issues, but I have 7 GB available and this is a very small hash file (fewer than 15,000 rows).
The hash file is written to the project directory, and the job does not use write caching.
I recently upgraded from version 6.0.1 to version 7.5. This job ran for a year and a half under version 6 without a failure.
One thing that may be unusual about this job is that it writes to the same hash file through multiple links. The job reads from a single Oracle stage into a single transformer, and the transformer has multiple output links into a single hash file stage. The transformer contains logic that sets part of the hash file key based on the input row (a sketch of the derivations is below).
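To illustrate, the key derivations look something like this (the link and column names here are invented for the example; the real job differs in detail):

    * Key1 derivation (literal set by transformer logic; names are examples):
    If DSLink3.REC_TYPE = "C" Then "COUNTRY" Else "STATE"
    * Key2 derivation (value taken from the input row):
    DSLink3.CODE

That is how a record id like 'COUNTRY JM' gets built from the two key columns.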
On that re-run, the row counts shown in the log were correct, but the rows themselves were not in the hash file.
Running the job a third time did write all the rows to the hash file.
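For anyone who wants to check a hash file independently of the job log, you can count the records from the UniVerse shell in the project account. The path and pointer name below are examples only, not my real ones:

    SETFILE /path/to/ProjectDir/CountryHash COUNTRY.PTR OVERWRITING
    COUNT COUNTRY.PTR
    ANALYZE.FILE COUNTRY.PTR

SETFILE creates a VOC pointer to the file, COUNT reports how many records are actually in it, and ANALYZE.FILE shows the file's type, modulus, and group statistics, which can hint at structural problems.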
I have opened a case with Ascential tech support but I wanted to see if anyone else had run into this type of problem.