You know that it's not recommended, and you're working to get rid of them? And hashed files over 2GB are perfectly fine as long as they are 64-bit hashed files.
2000 or 2050 sounds a bit high to me; I'm not sure whether that would also cause an issue. We are supporting something like 15 projects with our T30FILE parameter setting of 500. Other opinions welcome!
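For reference, T30FILE is one of the tunables in the engine's uvconfig file; after changing it you regenerate the binary configuration and restart the engine. A sketch of the usual procedure (the paths assume a default `$DSHOME` install; check your own environment before running anything):

```
# Edit $DSHOME/uvconfig and raise the tunable, e.g.:
#   T30FILE 500
cd $DSHOME
bin/uvregen        # regenerate the binary configuration from uvconfig
# then stop and restart the engine, e.g. uv -admin -stop / uv -admin -start
```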
BTW, what is your operating system?
Rare Error Message
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
Hashed files can, if 64-bit addressing is used, be much bigger than 2GB if required.
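To make that concrete: at the engine's TCL prompt an existing hashed file can be resized to 64-bit addressing in place, or created that way from the start. This is a syntax sketch, the file name is a placeholder, and you should take a backup before resizing:

```
* Convert an existing dynamic hashed file to 64-bit addressing
RESIZE MyHashedFile * * * 64BIT

* Or create a new one with 64-bit addressing from the outset
CREATE.FILE MyHashedFile DYNAMIC 64BIT
```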
Check whether "they" have any processes that remove the .Type30 files, perhaps because those files are zero-length. If they do, oblige them to change that behaviour so that files called .Type30 are never deleted; deleting them is what causes the corruption. At that point you don't have a hashed file any more: all you have is a directory, which will be substantially slower!
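One way to hunt for the culprit is to see what a generic "purge empty files" sweep would catch. The sketch below uses hypothetical paths; it shows why such a sweep hits the zero-length .Type30 marker that sits inside a hashed file's directory, and how to exclude it:

```shell
# Simulate a hashed file's directory layout (paths are illustrative only).
# A dynamic hashed file is a directory containing DATA.30, OVER.30 and a
# zero-length marker file named .Type30; deleting the marker corrupts it.
mkdir -p /tmp/hfdemo/MyHashedFile
touch /tmp/hfdemo/MyHashedFile/.Type30   # the zero-length type marker
touch /tmp/hfdemo/MyHashedFile/DATA.30   # empty here for the demo

# Dangerous: a naive "remove empty files" sweep lists the .Type30 marker
# (and anything else zero-length) as a deletion candidate.
find /tmp/hfdemo -type f -empty | sort

# Safer: exclude the .Type30 markers explicitly.
find /tmp/hfdemo -type f -empty ! -name '.Type30' | sort

rm -rf /tmp/hfdemo
```

If a scheduled cleanup script on the server contains the first form of that `find` (typically piped to `rm` or used with `-delete`), that is very likely your corruption source.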
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Coming back to this as we just ran into this message on a new Server install. Compounding the troubleshooting was the fact that this new HP-UX based server was installed with 7.5.2 of DataStage where all others are still on 7.5.1A. Testing and all that rot.
One particular hashed file would not create, and any attempt to do so resulted in the job aborting with the same 'floating point exception' noted earlier. Still trying to track down exactly what caused the issue, but I noticed two things that were a little out of the ordinary:
1) Initial creation parameters were 'non-standard' - Large Record and Record Size had been changed and were equal.
2) A Varchar field was declared with a Length of 10,000.
Putting the hashed file parameters back to their defaults and changing the Varchar to a LongVarchar seems to have solved the problem. Hopefully either Support will figure out the why, or I'll get time to go back and see which bits actually fixed it. In the meantime, it needed to be fixed NOW.
-craig
"You can never have too many knives" -- Logan Nine Fingers
"You can never have too many knives" -- Logan Nine Fingers
RECORD.SIZE performs a calculation to set GROUP.SIZE and LARGE.RECORD. The problem's probably there. It's not a required tuning parameter and should therefore be left blank.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.