strange problem with output hashed file

Posted: Wed Jul 12, 2006 8:48 am
by ppalka
Hi,

I have a very strange problem with hashed files.
It's a simple job:
hf1--->Transformer--->hf2
Readable data in hf1.
In the transformer I use three input columns as input arguments to a routine, and then use the routine's return value as one of the output columns.
But in my output I get about fifty records with unreadable data in that column. The bad records start at around record 100 in the hashed file and end at around record 150; from there to the end everything is fine.
Workarounds we have found so far:
- Assign the input column that is the first input argument to the routine to a stage variable, then pass that variable as the first argument. After that we get good results in the output hashed file.
- Adding a new output link from the transformer to a sequential file also gives a good result.

Can anybody explain what is going on? Is this some DS bug?
I can't find any rational explanation for that situation :(

Posted: Wed Jul 12, 2006 8:57 am
by kcbland
If you doubt the output, add an extra link to go to a Sequential file. Are you expecting sorted data in some fashion and using stage variables with COMMONs? Define unreadable data.

Posted: Wed Jul 12, 2006 9:06 am
by ppalka
kcbland wrote:If you doubt the output, add an extra link to go to a Sequential file. Are you expecting sorted data in some fashion and using stage variables with COMMONs? Define unreadable data. ...
Yes, I use COMMON variables, but not in this routine.
I don't need sorted data; I sort in a later step.
Unreadable data: square symbols, etc. - characters that cannot be typed directly on the keyboard.

Posted: Wed Jul 12, 2006 9:36 am
by kcbland
Spool to a file at the same time and see what the result looks like. Spool all values used in the function to the file, as well as the result from the function. Look for what's wrong.

Posted: Wed Jul 12, 2006 9:54 am
by ppalka
kcbland wrote:Spool to a file at the same time and see what the result looks like. Spool all values used in the function to the file, as well as the result from the function. Look for what's wrong.
But this is the issue. When I don't have an output to a sequential file I get wrong results in the hashed file, but when I add a sequential file output I get good results in both the hashed file and the sequential file.

Posted: Wed Jul 12, 2006 9:58 am
by kcbland
Turn off row-buffering, write-caching, and inter-process and try again. I can't really tell you what you're doing wrong, but it could be a bug.

Posted: Wed Jul 12, 2006 10:03 am
by ppalka
kcbland wrote:Turn off row-buffering, write-caching, and inter-process and try again. I can't really tell you what you're doing wrong, but it could be a bug. ...
OK, I will try this tomorrow and will let you know the results.

Posted: Wed Jul 12, 2006 11:27 am
by ray.wurlod
Please post the routine code.

Posted: Thu Jul 13, 2006 1:59 am
by ppalka
kcbland wrote:Turn off row-buffering, write-caching, and inter-process and try again. I can't really tell you what you're doing wrong, but it could be a bug. ...
The "record level read" option was enabled on the source hashed file. After disabling it I get good results. So is this a DS bug? I think this option should work fine ...

Posted: Thu Jul 13, 2006 2:02 am
by ppalka
ray.wurlod wrote:Please post the routine code.
Since I have found the reason for the wrong results, there is no need to post the routine code.

Posted: Thu Jul 13, 2006 6:28 am
by DSguru2B
Could you kindly share the solution with us, so that future posters encountering a similar problem will know what to do?

Posted: Thu Jul 13, 2006 6:53 am
by kcbland
ppalka wrote: The "record level read" option was enabled on the source hashed file. After disabling it I get good results. So is this a DS bug? I think this option should work fine ...
Why did you check it in the first place?

Posted: Thu Jul 13, 2006 7:03 am
by ppalka
DSguru2B wrote:Could you kindly share the solution with us, so that future posters, encountering a similar problem, will know what to do.
I have just posted three solutions :)
Two tricks:
1. add an extra output link from the transformer;
2. pass the input argument through a stage variable.

And the best solution, I think, is to disable the "Record level read" option on the input hashed file of the job. That way you don't need to change the design of your job at all.

Posted: Thu Jul 13, 2006 7:09 am
by ppalka
kcbland wrote:Why did you check it in the first place?
It wasn't me. Two other people were also working on that project, and I don't know why that option was checked. I think it was probably due to incomplete knowledge of DS.