Error when writing records to hash file
Moderators: chulett, rschirm, roy
Hi,
Please help me to overcome this problem. I am using a hashed file (Type 30) as a lookup against a DB2 table. There are 9096946 records in the DB2 table, and when I write these records to the hashed file, the job aborts after 8367235 records with this error:
Error :
Insert_In2_Product_And_SW_Tables..Prod_Ctry_Hash.IDENT3: [IBM][CLI Driver] SQL0902C A system error (reason code = "18") occurred. Subsequent SQL statements cannot be processed. SQLSTATE=58005
[IBM][CLI Driver] SQL30081N A communication error has been detected. Communication protocol being used: "TCP/IP". Communication API being used: "SOCKETS". Location where the error was detected: "9.149.91.27". Communication function detecting the error: "recv". Protocol specific error code(s): "78", "*", "*". SQLSTATE=08001
SQLFetch: Error retrieving results from server.
Do I need to change anything in the hashed file, like the modulus? I kept all the default values.
Thanks,
Somaraju
My guess is that you blew through the 2.2 GB limit on a 32-bit hashed file and the job aborted. Consider creating the file as a 64BIT file.
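For illustration, a 64BIT dynamic file can be created ahead of time from the TCL (UniVerse command) prompt rather than letting the stage create a default 32-bit one. A minimal sketch, using the file name from your log; verify the CREATE.FILE options against your UniVerse release:

Code: Select all
>CREATE.FILE Prod_Ctry_Hash DYNAMIC 64BIT

The job can then write to the existing file instead of creating it, and the 2.2 GB addressing limit no longer applies.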
Kenneth Bland
Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
-
- Premium Member
- Posts: 1255
- Joined: Wed Feb 02, 2005 11:54 am
- Location: United States of America
Hi Somu,
The problem you are facing is not with the configuration of your hashed file; I suspect it is with your ODBC setup. Make sure that the correct drivers are set up.
Thanks,
Naveen.
Anything that won't sell, I don't want to invent. Its sale is proof of utility, and utility is success.
Author: Thomas A. Edison 1847-1931, American Inventor, Entrepreneur, Founder of GE
Hi,
Actually, you are correct. As Ken said, it is probably a problem with the size of your hashed file.
You can use HFC to configure a hashed file for 64-bit addressing. There are already a lot of posts covering the HFC topic; please search the forum.
Thanks,
Naveen.
Chances are the job aborts and you're just getting a message from the CLI driver about it terminating. The root cause, I'm guessing, is that the hashed file size exceeded 2.2 GB.
Kenneth Bland
Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
Chances are that you have corrupted the hashed file, so you will need to delete it and re-create it. Or just clear it of contents, which may be easier. To resize a hashed file to 64-bit, you need RESIZE, for which you will need a VOC pointer. Search the forum for SETFILE. The Hashed File stage does not include the capability to create a 64-bit hashed file.
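For illustration, the sequence might look like this from the TCL prompt. The path is hypothetical, so substitute your own, and verify the SETFILE and RESIZE options against your UniVerse release:

Code: Select all
>SETFILE /path/to/project/Prod_Ctry_Hash Prod_Ctry_Hash OVERWRITING
>RESIZE Prod_Ctry_Hash * * * 64BIT

SETFILE establishes the VOC pointer to the hashed file; RESIZE with the type, modulo and separation left as * changes only the addressing to 64-bit.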
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Hi ray,
Thanks for your reply. Can I resize a dynamic (Type 30) file to 64-bit? I read in one of the posts that there is an HFC.exe in the client version of DataStage. How can I use HFC to see the type, modulus and separation? (I don't have HFC.exe with me.) Is there any other way than HFC? Also, my hashed file has three key columns; can I use Type 18?
Thanks,
somaraju
Type 18 is not a dynamic hashed file. In the circumstances described earlier (loading an ad hoc set of rows from a DB2 table) I'd counsel against using a static hashed file. Stay with Type 30 (dynamic hashed).

somu_june wrote:
Hi ray,
Thanks for your reply. Can I resize a dynamic (Type 30) file to 64-bit? I read in one of the posts that there is an HFC.exe in the client version of DataStage. How can I use HFC to see the type, modulus and separation? (I don't have HFC.exe with me.) Is there any other way than HFC? Also, my hashed file has three key columns; can I use Type 18?
Thanks,
somaraju
You can see the tuning parameters in a number of ways. Probably the easiest is to use the UVfile command from the operating system prompt.
Code: Select all
$ UVfile D_DS_JOBOBJECTS
D_DS_JOBOBJECTS: Hashed file (DataStage Type 3, Modulo 1, Separation 2).
$ UVfile DS_JOBOBJECTS
DS_JOBOBJECTS: Dynamic file (DataStage Type 30).
Code: Select all
ANALYZE.FILE DS_JOBOBJECTS
File name.................. DS_JOBOBJECTS
Path name.................. DS_JOBOBJECTS
File type.................. DYNAMIC
Hashing Algorithm.......... GENERAL
No. of Groups (modulus).... 2 current ( minimum 1 )
Large record size.......... 1628 bytes
Group size................. 2048 bytes
Load factors............... 80% (split), 50% (merge) and 44% (actual)
Total size................. 24576 bytes
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.