Hi,
I am trying to use ANALYZE.FILE D_HashTest
But I am getting the following error:
Illegal option found on command line: D_HashTest
Please note that D_HashTest is a dynamic hashed file.
Please suggest.
Thanks
Analyzing Dynamic Hash File
-
- Premium Member
- Posts: 273
- Joined: Wed Oct 18, 2006 12:20 pm
- Location: Porto
Analyzing Dynamic Hash File
-
- Premium Member
- Posts: 273
- Joined: Wed Oct 18, 2006 12:20 pm
- Location: Porto
Re: Analyzing Dynamic Hash File
Another Query: asitagrawal wrote:Hi,
I am trying to use ANALYZE.FILE D_HashTest
But I am getting the following error: ...
Can someone please explain how the number of bytes is calculated from the field length? Please check the image at the URL below:
http://asit.agrawal.googlepages.com/Query3.jpg
Thanks
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
That is output from the HASH.HELP.DETAIL command or something similar. The record length is not calculated from the field sizes; it is actually measured from the data stored in the file.
What you have highlighted in the picture is merely the key columns. Chances are that there are non-key columns as well. These will contribute to the reported record size.
This utility does not take into account physical storage overheads, which may include any or all of:
- record header (12, 20, 24 or 40 bytes)
- segment mark between key and data
- padding to whole word boundary
Hashed File Calculator does attempt to take the storage overheads into account, except for the extra for oversized records.
By the way, what you analyzed is NOT a dynamic hashed file; it's a static hashed file (Type 3).
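The overheads listed above can be put together in a small sketch. This is illustrative only: the header size and word size are assumptions for the example, not exact UniVerse internals, and `physical_record_size` is a hypothetical helper, not a real utility.

```python
# Hypothetical sketch: estimate the on-disk size of one hashed-file record,
# adding the overheads listed above to the raw key + data bytes.
# Header size and word size are assumed values for illustration.

RECORD_HEADER = 12   # assumed; the header can be 12, 20, 24 or 40 bytes
WORD_SIZE = 4        # assumed machine word used for boundary padding

def physical_record_size(key_bytes: int, data_bytes: int,
                         header: int = RECORD_HEADER) -> int:
    """header + key + 1-byte segment mark + data, padded to a word boundary."""
    raw = header + key_bytes + 1 + data_bytes
    # round up to the next whole-word boundary
    return -(-raw // WORD_SIZE) * WORD_SIZE

print(physical_record_size(key_bytes=10, data_bytes=54))  # 80
```

With a 12-byte header, a 10-byte key, the segment mark, and 54 data bytes, the raw size is 77, which pads up to 80 on a 4-byte boundary.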
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
-
- Premium Member
- Posts: 273
- Joined: Wed Oct 18, 2006 12:20 pm
- Location: Porto
Hi Ray,ray.wurlod wrote:That is output from the HASH.HELP.DETAIL command or something similar. The record length is not calculated from the field sizes; it is actually measured from the data stored in the file.
What you ...
Please throw some light on the relation between the stored data and the number of bytes for a particular record (row).
I am trying to use the HFC utility, where I have to feed in the Record Size. It says the Record Size can be estimated from the fields' display sizes, and I have used the field display size as equal to the field length.
Regards
Asit
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
You have to count all fields, not just the key columns.
There is a single byte delimiter between each field. Count those too.
That is the figure to present to HFC - it does the rest.
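Ray's counting rule can be sketched in a few lines. The field names and display widths below are made up for the example; the point is simply "sum all fields, then add one byte per delimiter".

```python
# Illustrative sketch: estimate the Record Size to feed to HFC by summing
# the display widths of ALL fields (key and non-key) plus one byte for
# each single-byte delimiter between adjacent fields.
# These field names and widths are hypothetical.

field_display_widths = {
    "CUST_ID": 10,   # key column
    "NAME": 30,      # non-key columns count too
    "CITY": 20,
}

data_bytes = sum(field_display_widths.values())
delimiter_bytes = len(field_display_widths) - 1  # one delimiter between each pair

record_size_for_hfc = data_bytes + delimiter_bytes
print(record_size_for_hfc)  # 60 + 2 = 62
```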
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
-
- Premium Member
- Posts: 273
- Joined: Wed Oct 18, 2006 12:20 pm
- Location: Porto
Hi Ray,ray.wurlod wrote:You have to count all fields, not just the key columns.
There is a single byte delimiter between each field. Count those too.
That is the figure to present to HFC - it does the rest. ...
Can you please confirm/correct the following:
Char - 1 Byte
Integer - 4 Bytes
SmallInt - 2 Bytes
Date - ?
Decimal - ?
Thanks
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
You are not correct.
All data are stored as strings.
Char(N) = N bytes without NLS, up to 4N bytes with NLS
Integer = number of digits (therefore up to 10 characters, plus see above)
and so on.
That is why we advise to use the display width to estimate.
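Because everything is stored as a character string, a column's byte cost is the length of its string form, not its binary SQL size. A minimal sketch, assuming a single-byte encoding (with NLS each character can take up to 4 bytes):

```python
# Sketch of "all data are stored as strings": the stored width of a value
# is the length of its string representation. Single-byte encoding assumed;
# with NLS a character can occupy up to 4 bytes.

def stored_width(value) -> int:
    return len(str(value))

print(stored_width("A"))         # Char(1)   -> 1 byte
print(stored_width(2147483647))  # Integer   -> up to 10 digits -> 10 bytes
print(stored_width(-12345.67))   # Decimal   -> sign and point count too -> 9
```

This is why a SQL `Integer` is not 4 bytes here, and why the display width is the sensible estimate to feed to HFC.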
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.