Hashed File Quantity Limits
I've never encountered that error before, but have created many hashed files simultaneously. I wonder if the error might be due to Windows drives. Is the "S" drive local to the DataStage server or on a mapped network drive? Could you test the hashed file creation on a local drive to see if the error is the same?
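For a quick check without building a whole job, you could also try creating a dynamic file from the Administrator command window, which works against the (local) project directory - something like the line below, although I'm quoting the syntax from memory and the file name is only an example, so please check it against your release:

CREATE.FILE TESTHASH 30

If that succeeds locally but the job writing to S: still fails, that would point at the mapped drive rather than at DataStage itself.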
Hi ArndW,
By the time this error occurs, I've already created about 200 hashed files in the same directory, using various methods (standard and non-standard, altering the minimum modulus attribute with various values).
So I don't suspect the Windows drives... could it be a cache limit?
Thanks
ArndW wrote:I've never encountered that error before, but have created many hashed files simultaneously. I wonder if the error might be due to Windows drives. Is the "S" drive local to the DataStage server or on ...
Creating files doesn't get cached in DataStage. Is your "S" drive on the network? Can you try a test job with all the files on a local drive?
Hi,
My "S" directory is on the network.
The job loads from one hashed file in the same directory(!), then performs a couple of lookups and writes the result both to an Informix table (works fine) and to another hashed file in that same directory (aborts).
I find it hard to understand; my first guess was that the limit on the total number of hashed files had been reached (if there is one)...
I've forwarded this issue to the network team so they can check the ODBC connections and drivers.
I'll provide feedback on the resolution when it's done (if there is one).
Thanks
ArndW wrote:Creating files doesn't get cached in DataStage. Is your "S" drive on the network? Can you try a test job with all the file to a local drive? ...
The system-wide limit on the number of simultaneously open dynamic (Type 30) hashed files is set by the T30FILE configuration parameter. If this value is 200, then the 201st open will fail - and a job that creates a hashed file almost certainly opens it as well.
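If you do need more than 200, the usual remedy (this is from memory, so treat the exact directory names and values as indicative rather than authoritative) is to raise T30FILE in the engine's uvconfig file and regenerate the configuration:
1. Stop the DataStage services.
2. Edit uvconfig in the engine (DSEngine) directory and increase the value, for example changing T30FILE 200 to T30FILE 512.
3. From that directory run bin\uvregen (bin/uvregen on UNIX).
4. Restart the services.
As far as I'm aware a larger value only costs a small amount of shared memory, so there is little penalty in being generous.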
It is always a bad idea to perform lookups - of whatever kind - over a network. It's horribly inefficient. You have to put the key into a packet, send that to the other machine and wait for it to do whatever it has to do; it then puts the resulting row into a packet and sends it back, where you have to unpack it again. All this is unnecessary overhead, and the network latency doesn't help either.
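To put rough numbers on it (purely illustrative figures): at even 1 ms of round trip per lookup, a stream of 1,000,000 rows spends about 1,000 seconds - over a quarter of an hour - doing nothing but waiting on the wire, before the remote machine does any useful work. The same lookups against a hashed file on local disk, served largely from the read cache, take microseconds each.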
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
I am wondering if anyone has found a solution to this issue. I have encountered it a couple of times now. We have tried deleting the temp file and rerunning, but the error seems to leave something in the project in a state that never gets past the error. The way we have gotten around it is by changing the name of the target file. The limit of 200 hashed files was not a factor in any of the times this has happened.