Unable to allocate Type 30 descriptor, table is full
Moderators: chulett, rschirm, roy
-
- Participant
- Posts: 7
- Joined: Wed Nov 10, 2004 10:42 am
Unable to allocate Type 30 descriptor, table is full
What does it mean?
The entire error I get is:
DataStage Job 3037 Phantom 29927
Program "DSD.UVOpen": Line 389, Unable to allocate Type 30 descriptor, table is full.
DataStage Phantom Finished
Thanks
federico.l,
there is a DataStage configuration file ($dshome/uvconfig) with a setting, T30FILE, which controls this limit; it would seem that you have exceeded it at runtime.
You will need to edit this text file, increase the value to a more suitable one, stop DataStage, regenerate the configuration file, then restart DataStage so that the new number of files is allocated in shared memory.
A search in this forum of T30FILE might yield more information if you require it.
What is your T30FILE setting? Is your system that busy?
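For reference, the cycle described above looks roughly like this. This is only a sketch: the exact paths and admin flags vary between DataStage versions and install locations, so verify each step against your own $DSHOME before running anything.

```shell
# Assumes $DSHOME points at the DataStage engine directory (install-specific).
cd $DSHOME

# 1. Back up and edit the tunables file; raise the T30FILE value
#    (the number of dynamic/hashed file descriptors in shared memory).
cp uvconfig uvconfig.bak
vi uvconfig              # change the T30FILE line to a larger value

# 2. Stop the DataStage server (make sure no jobs are running first).
bin/uv -admin -stop

# 3. Regenerate the binary configuration from the edited uvconfig.
bin/uv -admin -regen     # some versions use the uvregen utility instead

# 4. Restart DataStage; the new T30FILE value is now sized into shared memory.
bin/uv -admin -start
```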
-
- Participant
- Posts: 3337
- Joined: Mon Jan 17, 2005 4:49 am
- Location: United Kingdom
Re: Unable to allocate Type 30 descriptor, table is full
Are you trying to populate data into a hashed file? If yes, then this is the hashed file size limitation: in DataStage 7 the default hashed file is 32-bit, with a 2 GB maximum size, and if the file exceeds that value the job will fail with this error.
Solution: you need to resize the hashed file to 64-bit addressing from the UniVerse command prompt. Please ensure there is no activity on the hashed file (neither reading from nor writing to it).
command > RESIZE <hashed file name> * * * 64BIT
Regards
Arun Verma
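A sketch of that resize, assuming a local hashed file created in the project directory (the file name, project path, and dsenv location are placeholders; adjust for your install, and make sure no job is touching the file):

```shell
# Source the engine environment and open the UniVerse shell in the project
# that owns the hashed file.
. $DSHOME/dsenv
cd /path/to/ProjectDir
$DSHOME/bin/uvsh

# Then, at the TCL prompt:
#   ANALYZE.FILE MyHashedFile          (optional: inspect the file first)
#   RESIZE MyHashedFile * * * 64BIT    (convert to 64-bit addressing)
```

The three asterisks keep the existing file type, modulus, and separation unchanged; only the addressing is converted to 64-bit, which lifts the 2 GB ceiling.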
-
- Premium Member
- Posts: 291
- Joined: Wed Sep 26, 2007 11:23 am
- Location: Madrid, Spain
Hi all,
ArndW wrote: You will need to edit this text file, increase the value to a more acceptable one, stop DataStage, regenerate the configuration file, then restart DataStage so that the new number of files is used in shared memory.
I need to carry out this task, and I understand all the steps perfectly, except for the one in bold.
Are you referring to one of the configuration files used in DS EE to define the number of nodes? Aren't they just text files that you can swap between?
I have also read in other posts on this topic that it is necessary to execute uvregen. I hadn't heard of this before.
I pray every day that my customer moves to DataStage 8.1