I'm getting a strange occurrence when I build the hash file.
The data in the source table contains a hyphen ('-'), in this case in the account number.
A key column on the hash file is the account number, VarChar(15).
Any account number with a hyphen in it is not written to the hash file. There are no warnings in the job log to indicate a problem.
Are you SURE there is no constraint or other obstacle to the write? How many rows does the job indicate were sent to the hashed file? How many actually made it? Can you reproduce it with a different hashed file? How was the hashed file created?
I've just created a test job (7.5.1) and it's entirely happy to load rows into a hashed file with VarChar(15) key and hyphens in the data values - starting, intermediate and trailing, all were accepted.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
No constraint specified in the job. Out of 100,000 rows, 15 contain hyphens, and those 15 do not appear in the hash file. The link information says that the rows were written to the hash file.
I thought maybe the character Datastage displayed in the 'view data' option was perhaps something else. I then queried the source data using a SQL tool and it showed a hyphen.
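One way to rule out a lookalike character for certain is to dump the code point of each character rather than trusting how a tool renders it. A minimal sketch (illustrative only, not part of DataStage; the function name is mine):

```python
# Tell a real ASCII hyphen (U+002D) apart from lookalikes such as
# EN DASH (U+2013) or NON-BREAKING HYPHEN (U+2011), which many tools
# render identically to a hyphen.
def dump_codepoints(value):
    """Return (char, 'U+XXXX') pairs for every character in value."""
    return [(ch, f"U+{ord(ch):04X}") for ch in value]

print(dump_codepoints("ACC-123"))        # plain ASCII hyphen
print(dump_codepoints("ACC\u2013123"))   # en-dash lookalike
```

If the SQL tool's export contains anything other than U+002D in those 15 rows, that would explain a mismatch the display hides.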
If you are creating your hashed file within your Account then you can execute the above statement. If not (if you are creating it as a pathed hashed file), you will have to use the SETFILE command to make a VOC entry for that hashed file.
Ok, another guess... perhaps your key choices are causing these 15 records to collapse down to one? Don't forget there are no such things as duplicates in a hashed file, it's destructive overwrite based on the keys defined - and last one in wins.
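That key-collapse behavior is easy to demonstrate with a keyed map, which is essentially how a hashed file stores rows. A rough simulation in Python (illustrative only, not the actual DataStage engine; names are mine):

```python
# Illustrative simulation: a hashed file behaves like a keyed map, so
# writing a row whose key already exists destructively overwrites the
# earlier row -- N sent rows with the same key yield one stored row.
def load_hashed(rows, key_cols):
    store = {}
    for row in rows:
        key = tuple(row[c] for c in key_cols)
        store[key] = row  # destructive overwrite: last one in wins
    return store

rows = [
    {"acct": "123-456", "amount": 10},
    {"acct": "123-456", "amount": 20},  # same key, replaces the first
    {"acct": "789",     "amount": 30},
]
result = load_hashed(rows, ["acct"])
print(len(result))  # 2 stored rows from 3 sent rows
```

So if those 15 rows share key values with other rows (or with each other), the link count and the stored row count will legitimately differ with no warning logged.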
-craig
"You can never have too many knives" -- Logan Nine Fingers
ewartpm wrote: The link information says that the rows were written to the hash file.
No, the link information says that the rows were SENT to the hashed file.
Something has apparently prevented their being written.
Can you try two diagnostic things? First, replace the Hashed File stage with a Sequential File stage. Second, replace the Hashed File stage with a UV stage that refers to the same hashed file (you may need to create a VOC pointer, for example using the SETFILE command). Can you let us know what happens in each case?
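For the second test, the VOC pointer can be created from the engine's command shell. A sketch of the form the SETFILE command takes, if memory serves (the path and name here are placeholders, not from the original job):

```
SETFILE /path/to/MyHashedFile MyHashedFile OVERWRITING
```

That gives the UV stage a catalogued name through which it can open the pathed hashed file directly.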