I have a very strange problem with hashed files.
Simple job:
hf1 ---> Transformer ---> hf2
The data in hf1 is readable.
In the Transformer I pass three input columns as arguments to a routine, and use the routine's return value as the derivation of one output column.
But in my output I get about fifty records with unreadable data in that column. The bad records start at roughly record 100 of the hashed file and end around record 150; from there to the end everything is fine.
What we have found so far:
- If the input column that is the first argument to the routine is first assigned to a stage variable, and that stage variable is then passed as the first argument, the output hashed file is correct.
- Adding a second output link from the Transformer to a sequential file also produces correct results.
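For clarity, this is roughly what the working Transformer setup looks like (link, column, and routine names here are made up for illustration; substitute your own):

```
* Stage variable (evaluated before output derivations):
svArg1 = InLink.Col1

* Output column derivation -- pass the stage variable
* instead of the input column directly:
MyRoutine(svArg1, InLink.Col2, InLink.Col3)
```

With the original derivation `MyRoutine(InLink.Col1, InLink.Col2, InLink.Col3)` we get the corrupted records; only the stage-variable version is clean.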
Can anybody explain what is going on? Is this a DataStage bug?
I can't find any rational explanation for this behaviour.
:(