
Unable To Open Hash File - Multi-Instance Job

Posted: Tue Jun 29, 2004 1:31 am
by arunverma
We have a multi-instance job which extracts data from seven servers and loads it into the DSS server. During data extraction we create a hashed file for each server, and during data loading we read from those hashed files.

For each server we create a hashed file, for example:
Hash_file_11
Hash_file_12
Hash_file_13

and so on, so there is no chance of a read and a write on the same file at the same time. This application has been running for the last six months, but yesterday we got the following error while loading data into DSS - "Unable to open hashed file":

DSD.UVOpen Unable to open file HProductCancelReason_51.

DataStage Job 836 Phantom 25631
Program "DSD.UVOpen": Line 396, Unable to allocate Type 30 descriptor, table is full.
Job Aborted after Fatal Error logged.
Attempting to Cleanup after ABORT raised in stage J10IcaLor.51.T1
DataStage Phantom Aborting with @ABORT.CODE = 1


Please help me to resolve this issue.

Thanks and Regards

Arun Verma

Posted: Tue Jun 29, 2004 3:57 am
by ray.wurlod
Arun,

"File descriptor table is full" refers to the in-memory table of concurrently open dynamic hashed files.

The size of this table is set by the T30FILE configuration parameter in the uvconfig file. Once you have increased it, you need to run the uvregen utility, then stop and re-start DataStage.
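For what it's worth, on a UNIX installation the usual sequence looks something like the following sketch (it assumes $DSHOME points at the DataStage engine directory - check your own install before running anything):

    cd $DSHOME
    . ./dsenv                    # set up the DataStage environment
    vi uvconfig                  # raise the T30FILE value, e.g. 1000 -> 2000
    bin/uvregen                  # regenerate the binary configuration
    bin/uv -admin -stop          # stop the DataStage server engine
    bin/uv -admin -start         # restart so the new T30FILE takes effect

The new value only takes effect after the restart, so schedule it for a quiet period.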

Are you trying to make your machine work too hard?!! :roll:

Regards,
Ray

Posted: Tue Jun 29, 2004 6:45 am
by arunverma
Dear Mr. Ray

We have checked the server log; when this error occurred a lot of applications were running at the same time. We have changed the schedule time - let's see what happens tomorrow.

We have a Sun server with 24 CPUs and 48 GB RAM. The value in uvconfig is "T30FILE 1000" - should we increase it?


Thanks and Regards


Arun Verma

Posted: Tue Jun 29, 2004 3:58 pm
by ray.wurlod
It's the only way to fix this problem.

Provided uvregen doesn't report a figure too close to the maximum shared memory segment size for your system, you could try increasing T30FILE to, say, 1500 or 2000. Each additional slot requires, as far as I can recall, just over 100 bytes of memory.
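As a rough sanity check (assuming roughly 100 bytes per slot): going from 1000 to 2000 slots adds about 1000 x 100 bytes, i.e. only around 100 KB of shared memory, which is negligible on a 48 GB machine. The thing to watch is the total segment size that uvregen reports against your system's maximum shared memory segment size (SHMMAX on Solaris).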

The only other approach is to investigate the use of static hashed files rather than dynamic, because static hashed files do not occupy a slot in the T30FILE table in memory.
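Purely as an illustration (the file name, type, modulus and separation below are made-up values, not a recommendation - check the CREATE.FILE syntax for your release), a static hashed file can be created from the TCL prompt with something like:

    CREATE.FILE MyLookupFile 18 4001 4

where 18 is a static hashing type, 4001 the modulus and 4 the separation. The trade-off is that you then have to size the modulus yourself from the expected data volume, whereas a type 30 (dynamic) file resizes itself automatically.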