Analyzing Dynamic Hash File

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

Post Reply
asitagrawal
Premium Member
Posts: 273
Joined: Wed Oct 18, 2006 12:20 pm
Location: Porto

Analyzing Dynamic Hash File

Post by asitagrawal »

Hi,

I am trying to use ANALYZE.FILE D_HashTest

But I am getting the following error:
Illegal option found on command line: D_HashTest

Please note that D_HashTest is a dynamic hashed file.

Please suggest.
Thanks
Last edited by asitagrawal on Wed Nov 22, 2006 1:00 pm, edited 1 time in total.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

No, it's not... but I'll wager HashTest is. :wink:
-craig

"You can never have too many knives" -- Logan Nine Fingers
asitagrawal
Premium Member
Posts: 273
Joined: Wed Oct 18, 2006 12:20 pm
Location: Porto

Re: Analyzing Dynamic Hash File

Post by asitagrawal »

asitagrawal wrote:Hi,

I am trying to use ANALYZE.FILE D_HashTest

But I am getting the following error:
Illegal option found on command line: D_HashTest

Please note that D_HashTest is a dynamic hashed file.

Please suggest.
Thanks
Another Query:

Can someone please explain how the number of bytes is calculated from the field length? Please check the URL below...

http://asit.agrawal.googlepages.com/Query3.jpg

Thanks
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

That is output from the HASH.HELP.DETAIL command or something similar. The record length is not calculated from the field sizes; it is actually measured from the data stored in the file.

What you have highlighted in the picture is merely the key columns. Chances are that there are non-key columns as well. These will contribute to the reported record size.

This utility does not take into account physical storage overheads, which may include any or all of:
  • record header (12, 20, 24 or 40 bytes)
  • segment mark between key and data
  • padding to whole word boundary
However I believe it does take into account the dynamic array delimiter characters within the data record itself.

Hashed File Calculator does attempt to take the storage overheads into account, except for the extra for oversized records.

By the way, what you analyzed is NOT a dynamic hashed file; it's a static hashed file (Type 3).
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
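Ray's list of physical overheads can be turned into a back-of-envelope calculation. The sketch below, in Python, is only illustrative arithmetic: the header sizes and the single segment mark come from the post above, while treating the word size as 4 bytes and defaulting the header to 12 bytes are my assumptions, not documented values.

```python
def stored_record_size(data_bytes, header_bytes=12, word_size=4):
    """Rough estimate of the physical space one record occupies in a
    hashed file: data plus the overheads ANALYZE-style utilities ignore.
    header_bytes may be 12, 20, 24 or 40; word_size=4 is an assumption
    for illustration only."""
    size = header_bytes          # record header
    size += data_bytes           # key + non-key data, incl. field marks
    size += 1                    # segment mark between key and data
    # pad the total up to a whole word boundary
    if size % word_size:
        size += word_size - (size % word_size)
    return size

# 50 data bytes + 12-byte header + 1 segment mark = 63, padded to 64
print(stored_record_size(50))
```

The point of the sketch is simply that the stored size is always somewhat larger than the raw data size, which is why the figure reported by the utility and a hand calculation from field sizes will not match exactly.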
asitagrawal
Premium Member
Posts: 273
Joined: Wed Oct 18, 2006 12:20 pm
Location: Porto

Post by asitagrawal »

ray.wurlod wrote:That is output from the HASH.HELP.DETAIL command or something similar. The record length is not calculated from the field sizes; it is actually measured from the data stored in the file.

What you ...
Hi Ray,

Please throw some light on the relation between the data stored and the bytes, for a particular record (row).
I am trying to use the HFC utility, where I have to supply the Record Size. It says the Record Size can be estimated from the fields' display sizes, and I have used the field display size as the field length.

Regards
Asit
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

You have to count all fields, not just the key columns.
There is a single byte delimiter between each field. Count those too.
That is the figure to present to HFC - it does the rest.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
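Ray's rule — every field's display width, plus one single-byte delimiter between each pair of adjacent fields — is easy to sketch in Python. The three example widths are hypothetical, purely to show the arithmetic:

```python
def estimate_record_size(display_widths):
    """Estimate the average record size to feed to HFC: the sum of
    every field's display width (key and non-key alike) plus one
    single-byte delimiter between each pair of adjacent fields."""
    if not display_widths:
        return 0
    return sum(display_widths) + (len(display_widths) - 1)

# e.g. a 10-char key, a 4-digit integer and a 25-char description:
print(estimate_record_size([10, 4, 25]))  # 10 + 4 + 25 + 2 delimiters = 41
```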
asitagrawal
Premium Member
Posts: 273
Joined: Wed Oct 18, 2006 12:20 pm
Location: Porto

Post by asitagrawal »

ray.wurlod wrote:You have to count all fields, not just the key columns.
There is a single byte delimiter between each field. Count those too.
That is the figure to present to HFC - it does the rest. ...
Hi Ray,
Can you please confirm/correct the following:

Char - 1 Byte
Integer - 4 Bytes
SmallInt - 2 Bytes
Date - ?
Decimal - ?

Thanks
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

You are not correct.
All data are stored as strings.
Char(N) = N bytes without NLS, up to 4N bytes with NLS
Integer = number of digits (therefore up to 10 characters, plus see above)
and so on.
That is why we advise using the display width to estimate.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
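Since everything in a hashed file is stored as a character string, the per-field estimate is a display width, not a binary size. A sketch of the mapping Ray describes, answering the type list above — the Char, Integer and SmallInt widths follow his post, while the Date width (internal day-number string) and the Decimal rule (digits plus a decimal point) are my assumptions:

```python
def display_width(sql_type, length=None, precision=None, scale=None):
    """Estimated stored width of one field, given that hashed files
    hold all values as strings (without NLS; with NLS multiply
    character counts by up to 4)."""
    t = sql_type.upper()
    if t == "CHAR":
        return length                 # Char(N) -> N bytes, not 1
    if t == "INTEGER":
        return 10                     # up to 10 digits as characters, not 4 bytes
    if t == "SMALLINT":
        return 5                      # "32767" is five characters, not 2 bytes
    if t == "DATE":
        return 5                      # assumption: internal day-number string
    if t == "DECIMAL":
        # assumption: precision digits plus a decimal point when scale > 0
        return (precision or 0) + (1 if (scale or 0) else 0)
    raise ValueError(f"unhandled type: {sql_type}")

print(display_width("SMALLINT"))  # 5
```

Summing these per-field widths (plus the inter-field delimiters) gives the record-size figure to present to HFC.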