Hash File Qtd Limits

ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

I've never encountered that error before, but have created many hashed files simultaneously. I wonder if the error might be due to Windows drives. Is the "S" drive local to the DataStage server or on a mapped network drive? Could you test the hashed file creation on a local drive to see if the error is the same?
Loobian
Participant
Posts: 11
Joined: Fri Sep 09, 2005 5:13 am

Post by Loobian »

Hi ArndW,

By the time this error occurs, I've already created about 200 hash files in the same directory, using various methods (standard and non-standard, altering the minimum modulus attribute with various values).

So I don't suspect the Windows drives... could it be a cache limit?

Thanks

ArndW wrote: I've never encountered that error before, but have created many hashed files simultaneously. I wonder if the error might be due to Windows drives. Is the "S" drive local to the DataStage server or on ...
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

Creating files doesn't get cached in DataStage. Is your "S" drive on the network? Can you try a test job that writes all the files to a local drive?
Loobian
Participant
Posts: 11
Joined: Fri Sep 09, 2005 5:13 am

Post by Loobian »

Hi,

My "S" directory is on the network.

The job is about loading from one hash file at the same directory(!), then it performs a couple of lookups and output the result to both informix table (works fine) and another hash in that (same) directory (abort).

I find it hard to understand, my first guess was limit of total hash files was reached (if there's one)...

I've foward this issue to the network team so they can check the ODBC connections and drivers.

I'll provide feedback of the resolution when it's done (if there's one :-D).

Thanks


ArndW wrote: Creating files doesn't get cached in DataStage. Is your "S" drive on the network? Can you try a test job that writes all the files to a local drive? ...
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL

Post by kcbland »

That's your problem. You should ALWAYS use physical drives local to the server for hashed files.
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

The system-wide limit on the number of simultaneously open dynamic (Type 30) hashed files is set by the T30FILE configuration parameter. If this value is 200, then the attempt to open the 201st will fail (and a job that creates a hashed file almost certainly opens it as well).
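
If you need to raise it, T30FILE lives in the engine's uvconfig file. A rough sketch only - the paths assume a default UNIX install, 512 is just an example value, and on Windows you would edit the same uvconfig under the engine directory and stop/start the DataStage services rather than using uv -admin:

    cd `cat /.dshome`        # DSEngine directory
    grep T30FILE uvconfig    # show the current setting, e.g. T30FILE 200
    vi uvconfig              # raise T30FILE, e.g. to 512
    bin/uv -admin -stop      # stop the engine while no jobs are running
    bin/uvregen              # regenerate the engine configuration
    bin/uv -admin -start     # restart the engine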

It is always a bad idea to perform lookups - of whatever kind - over a network. It's horribly inefficient. You have to put the key into a packet, send that to the other machine, and wait for the other machine to do whatever it has to do; it then puts the resulting row into a packet and sends it back, where you have to get it out of its packet again. All of this is unnecessary overhead, and the network latency doesn't help either.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
crang
Participant
Posts: 1
Joined: Tue Jul 29, 2003 12:22 pm
Location: USA

Post by crang »

I am wondering if anyone found a solution to this issue. I have encountered it a couple of times now. We have tried deleting the temp file and rerunning, but the error seems to leave something in the project in a state that cannot get past it. So the way we have gotten around it is by changing the name of the target file. The limit of 200 hash files was not a factor in any of the times this has happened.
Chuck Rang
Consultant
crang@ciber.com
(407) 563-6550
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

There might also be an operating system limit on the number of subdirectories in a directory; on Solaris it's 32765 (and each dynamic hashed file is itself a directory).

And it's "hashed" file, not "hash" file. :x
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
asitagrawal
Premium Member
Posts: 273
Joined: Wed Oct 18, 2006 12:20 pm
Location: Porto

Post by asitagrawal »

Hi,

Today I also faced the same problem, and the only solution that worked turned out to be a name change...

I have yet to try using a local disk instead of a network disk...

Regards,
Asit
Share to Learn, and Learn to Share.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

When claiming that "the limit is not an issue", remember that the Repository tables are mostly hashed files, and that you get five of these per job.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
sb_akarmarkar
Participant
Posts: 232
Joined: Fri Sep 30, 2005 4:52 am

Post by sb_akarmarkar »

Hi,

Are you using the delete/create hashed file option when creating the lookup? Sometimes this doesn't work. It's better to go with the option that clears the hashed file and then loads it.
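
If you want the same effect from a before-job routine rather than the stage option, something along these lines should do it. This is only a sketch: LookupHash is a made-up name, and CLEAR.FILE expects the hashed file to be known to the project account (a file addressed only by directory path would first need a VOC pointer, for example one created with SETFILE). The supplied ExecTCL before-job subroutine with "CLEAR.FILE LookupHash" as its input value is the no-code equivalent.

    * Clear the existing hashed file instead of deleting and recreating it.
    * "UV" tells DSExecute to run the command at the engine (TCL) level.
    Call DSExecute("UV", "CLEAR.FILE LookupHash", Output, SysRet)
    If SysRet <> 0 Then
       Call DSLogWarn("CLEAR.FILE failed: " : Output, "ClearLookupHash")
    End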

Thank You,
Anupam