Error when writing records to hash file

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

somu_june
Premium Member
Posts: 439
Joined: Wed Sep 14, 2005 9:28 am
Location: 36p,reading road

Error when writing records to hash file

Post by somu_june »

Hi,

Please help me overcome this problem. I am using a hashed file (Type 30) as a lookup against a DB2 table. The DB2 table has 9096946 records, and when I write these records to the hashed file the job aborts after 8367235 records with this error:


Error:

Insert_In2_Product_And_SW_Tables..Prod_Ctry_Hash.IDENT3: [IBM][CLI Driver] SQL0902C A system error (reason code = "18") occurred. Subsequent SQL statements cannot be processed. SQLSTATE=58005

[IBM][CLI Driver] SQL30081N A communication error has been detected. Communication protocol being used: "TCP/IP". Communication API being used: "SOCKETS". Location where the error was detected: "9.149.91.27". Communication function detecting the error: "recv". Protocol specific error code(s): "78", "*", "*". SQLSTATE=08001

SQLFetch: Error retrieving results from server.

Do I need to change anything in the hashed file, like the modulus? I kept everything at the default values.


Thanks,
Somaraju
somaraju
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL

Post by kcbland »

My guess is that you blew through the 2.2GB limit on the 32BIT hashed file and the job aborted. Consider creating the file as a 64BIT file.
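If you want to confirm that before resizing, one quick check (just a sketch; the project path is a placeholder, and Prod_Ctry_Hash is taken from your error message) is to look at the size of the data and overflow files inside the hashed file's directory from the operating system prompt. A Type 30 hashed file is a directory containing DATA.30 and OVER.30:

Code:

$ cd /path/to/your/project            # hypothetical DataStage project directory
$ ls -l Prod_Ctry_Hash/DATA.30 Prod_Ctry_Hash/OVER.30   # data and overflow portions of the Type 30 file

If either file is up around the 2GB mark, you have hit the 32-bit addressing limit.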
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
I_Server_Whale
Premium Member
Posts: 1255
Joined: Wed Feb 02, 2005 11:54 am
Location: United States of America

Post by I_Server_Whale »

Hi Somu,

The problem you are facing is not with the configuration of your hashed file; I suspect it is with your ODBC setup. Make sure that the correct drivers are set up.

Thanks,
Naveen.
Anything that won't sell, I don't want to invent. Its sale is proof of utility, and utility is success.
Author: Thomas A. Edison 1847-1931, American Inventor, Entrepreneur, Founder of GE
somu_june
Premium Member
Posts: 439
Joined: Wed Sep 14, 2005 9:28 am
Location: 36p,reading road

Post by somu_june »

Hi kcbland,

Can you please tell me how to create a 64-bit file? Can I use the Type 30 file, or do I have to select a static type? If I select a static type, will there be any effect on performance?




Thanks,
Somaraju
somaraju
somu_june
Premium Member
Posts: 439
Joined: Wed Sep 14, 2005 9:28 am
Location: 36p,reading road

Post by somu_june »

Hi Naveen,

Please correct me if I am wrong. If the problem were with the ODBC drivers, why did it load 8323456 records before aborting? Also, I am using the DB2 API stage, not ODBC.



Thanks,
Somaraju
somaraju
I_Server_Whale
Premium Member
Posts: 1255
Joined: Wed Feb 02, 2005 11:54 am
Location: United States of America

Post by I_Server_Whale »

Hi,

Actually, you are correct. As Ken said, it is probably a problem with the size of your hashed file.

You can use HFC to configure a hashed file for 64-bit processing. There are already a lot of posts covering the HFC topic. Please search the forum.

Thanks,
Naveen.
Anything that won't sell, I don't want to invent. Its sale is proof of utility, and utility is success.
Author: Thomas A. Edison 1847-1931, American Inventor, Entrepreneur, Founder of GE
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL

Post by kcbland »

Chances are the job aborts and you're just getting a message from the CLI driver about it terminating. The root cause, I'm guessing, is that the hashed file size exceeded 2.2GB.
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
somu_june
Premium Member
Posts: 439
Joined: Wed Sep 14, 2005 9:28 am
Location: 36p,reading road

Post by somu_june »

Hi Naveen,


Thanks for the reply. I will search for HFC. Sorry for the double posting; when I posted the first time I could not see it in the forum, so I posted again.




Thanks,
Somaraju
somaraju
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Chances are that you have corrupted the hashed file, so you will need to delete it and re-create it. Or just clear it of contents, which may be easier. To resize a hashed file to 64-bit, you need RESIZE, for which you will need a VOC pointer. Search the forum for SETFILE. The Hashed File stage does not include the capability to create a 64-bit hashed file.
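A minimal sketch of that sequence (the path and the VOC pointer name here are placeholders, and it assumes you have cleared or re-created the corrupted file first), run from the project's TCL prompt or the Administrator client Command window:

Code:

SETFILE /path/to/project/Prod_Ctry_Hash PROD_CTRY_HASH OVERWRITING
RESIZE PROD_CTRY_HASH * * * 64BIT

The SETFILE command creates the VOC pointer (only needed if the file was created by pathname rather than in the account), and the three asterisks tell RESIZE to keep the current type, modulus and separation and change only the addressing to 64-bit.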
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
I_Server_Whale
Premium Member
Posts: 1255
Joined: Wed Feb 02, 2005 11:54 am
Location: United States of America

Post by I_Server_Whale »

Wonderful, Ray! That was good information.

Many Thanks,
Naveen.
Anything that won't sell, I don't want to invent. Its sale is proof of utility, and utility is success.
Author: Thomas A. Edison 1847-1931, American Inventor, Entrepreneur, Founder of GE
somu_june
Premium Member
Posts: 439
Joined: Wed Sep 14, 2005 9:28 am
Location: 36p,reading road

Post by somu_june »

Hi Ray,

Thanks for your reply. Can I resize a dynamic Type 30 file to 64-bit? I read in one of the posts that there is an HFC.exe in the client version of DataStage. How can I use HFC to see the type, modulus, and separation? (I don't have HFC.exe with me.) Is there any other way than HFC? Also, I have three key columns in the hashed file; can I use Type 18?



Thanks,
somaraju
somaraju
I_Server_Whale
Premium Member
Posts: 1255
Joined: Wed Feb 02, 2005 11:54 am
Location: United States of America

Post by I_Server_Whale »

Hi Somu,

You can ask your DataStage admin for HFC.exe. I'm sure he must be aware of it.

Thanks,
Naveen.
Anything that won't sell, I don't want to invent. Its sale is proof of utility, and utility is success.
Author: Thomas A. Edison 1847-1931, American Inventor, Entrepreneur, Founder of GE
somu_june
Premium Member
Posts: 439
Joined: Wed Sep 14, 2005 9:28 am
Location: 36p,reading road

Post by somu_june »

Hi Naveen,

Thanks for your help.





Thanks,
Somaraju
somaraju
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

somu_june wrote: Hi Ray,

Thanks for your reply. Can I resize a dynamic Type 30 file to 64-bit? I read in one of the posts that there is an HFC.exe in the client version of DataStage. How can I use HFC to see the type, modulus, and separation? (I don't have HFC.exe with me.) Is there any other way than HFC? Also, I have three key columns in the hashed file; can I use Type 18?

Thanks,
somaraju
Type 18 is not a dynamic hashed file. In the circumstances described earlier (loading an ad hoc set of rows from a DB2 table) I'd counsel against using a static hashed file. Stay with Type 30 (dynamic hashed).

You can see the tuning parameters in a number of ways. Probably the easiest is to use the UVfile command from the operating system prompt.

Code:

$ UVfile D_DS_JOBOBJECTS
D_DS_JOBOBJECTS: Hashed file (DataStage Type 3, Modulo 1, Separation 2).

$ UVfile DS_JOBOBJECTS
DS_JOBOBJECTS: Dynamic file (DataStage Type 30). 
That does not show everything you want for dynamic hashed files. You can get the rest with the ANALYZE.FILE command within DataStage (for example, from the Administrator client Command window). Separation is reported as Group Size, the hashing algorithm under its own name, and the modulo as both current and minimum.

Code:

ANALYZE.FILE DS_JOBOBJECTS
File name.................. DS_JOBOBJECTS
Path name.................. DS_JOBOBJECTS
File type.................. DYNAMIC
Hashing Algorithm.......... GENERAL
No. of Groups (modulus).... 2 current ( minimum 1 )
Large record size.......... 1628 bytes
Group size................. 2048 bytes
Load factors............... 80% (split), 50% (merge) and 44% (actual)
Total size................. 24576 bytes
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.