URGENT - Creation of Hash File/Transformer - Error

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

MaheshKumar Sugunaraj
Participant
Posts: 84
Joined: Thu Dec 04, 2003 9:55 pm

URGENT - Creation of Hash File/Transformer - Error

Post by MaheshKumar Sugunaraj »

Hi,

When I execute the job, it starts running and then I get the following error:

DataStage Job 35 Phantom 28398
Program "DSU.TransactionType": Line 14, Unable to open the operating system file "DSU_BP.O/DSTransformerError".
[ENOENT] No such file or directory
Program "DSU.TransactionType": Line 14, Unable to load file "DSTransformerError".
Program "DSU.TransactionType": Line 14, Unable to load subroutine.
Attempting to Cleanup after ABORT raised in stage sourceToGSIR..xForm
DataStage Phantom Aborting with @ABORT.CODE = 3

I am only trying to load 2000000 million records into the Hash File from the DWH (though I know it is not advisable to load it into the Hash File).

I checked the Hash File and also the routines which I use inside the transformer. Please let me know.

Thanks & Regards
SMK
sumitgulati
Participant
Posts: 197
Joined: Mon Feb 17, 2003 11:20 pm
Location: India

Post by sumitgulati »

"DataStage Job 35" means that your Job code is 35.
In your system that holds DS server go into RT_BP35 folder (It will be in Ascential\DataStage\Projects|<ProjectName>\). There you would find files for active stages in your job. Open those files and check the line number 14. Try to find out what exactly this line is trying to do. This can help.

Regards,
Sumit
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

You're trying to use the DataStage function DSTransformError (congratulations - it's the right one to use).

Unfortunately, you added an extra, and incorrect, "er" to its name (DSTransformerError), so DataStage can't find it.

All the messages relate to the different places it searches.
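
By way of illustration only (the routine body, argument name and messages below are made up; the point is simply the spelling of the call), a transform routine entered in the Routine editor would call it like this:

    * Hypothetical body of a transform routine such as TransactionType(Arg1).
    * Ans is the routine's return value in the Routine editor.
    Begin Case
       Case Arg1 = 'C'
          Ans = 'CREDIT'
       Case Arg1 = 'D'
          Ans = 'DEBIT'
       Case @TRUE
          * Note the spelling: DSTransformError, not DSTransformerError.
          Call DSTransformError('Unknown transaction code: ' : Arg1, 'TransactionType')
          Ans = ''
    End Case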
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
peternolan9
Participant
Posts: 214
Joined: Mon Feb 23, 2004 2:10 am
Location: Dublin, Ireland
Contact:

Re: URGENT - Creation of Hash File/Transformer - Error

Post by peternolan9 »

MaheshKumar,
if it is urgent, call Ascential support. This forum is not formal support. People help out as best they can. You cannot expect any faster reply by saying 'Urgent'.

Now, do you mean 2 trillion rows in a hash file or 2 million....? Assuming you mean 2 million.....you need to be careful how large the hash file is.

Search for my name on this forum and you will see extensive discussions on hash file sizes... even in version 7, hash files have a hard limit of 999MB.

If you cannot get a hash file under 999MB you might as well use the database... it won't be that much slower than the hash file if you make sure the data stays in memory and does not get swapped out (i.e. if you can afford to pin the table)... we had to make some changes to our code to get under the 999MB limit.

Also, loading large hash files is surprisingly slow... Tom Nel is the fast-load hash file guru now, as he did a lot of testing for me on my last project on making hash files load faster...
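
Just to illustrate one of the tricks that came out of that testing (the file name and modulus below are made up, and you should check the CREATE.FILE syntax for your release): pre-creating the hashed file with a generous minimum modulus before the load avoids most of the dynamic-file group splitting that makes big loads crawl. From a before-job routine it could look something like:

    * Hypothetical before-job snippet: pre-size the target hashed file in the
    * project account so the load spends its time writing, not splitting groups.
    Call DSExecute('UV', 'CREATE.FILE TargetHash DYNAMIC MINIMUM.MODULUS 100003', Output, SysReturnCode)
    If SysReturnCode <> 0 Then Call DSLogWarn('CREATE.FILE failed: ' : Output, 'PreSizeHash')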

Like I said, there was a ton of discussion about it..



MaheshKumar Sugunaraj wrote:Hi,

I am only trying to load 2000000 million records into the Hash File from the DWH (though I know it is not advisable to load it into the Hash File).

I checked the Hash File and also the routines which I use inside the transformer. Please let me know.

Thanks & Regards
SMK
Best Regards
Peter Nolan
www.peternolan.com
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Hashed files do NOT have a hard-coded limit of 999MB, and never have.

This limit occurs only in the GUI, where "someone" neglected to allow more than three digits in the SpinButton control.

By default, hashed files have a limit of 2GB, caused by internal pointers being four bytes (32 bits). By explicitly specifying eight byte (64 bit) pointers (when the hashed file is created or subsequently resized), hashed files have a theoretical upper limit of around 19 million TB.
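
As an illustration only (the file name is hypothetical and the exact syntax should be checked for your release), a before-job routine could issue the engine's RESIZE command to convert an existing hashed file to 64-bit addressing:

    * Hypothetical snippet: switch a hashed file in the project account to
    * 64-bit internal pointers so it can grow past the 2GB default limit.
    Call DSExecute('UV', 'RESIZE CustomerHash * * * 64BIT', Output, SysReturnCode)
    If SysReturnCode <> 0 Then Call DSLogWarn('RESIZE failed: ' : Output, 'Resize64Bit')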

I still believe your error is caused by your mis-spelling the name of the function, and nothing at all to do with hashed files.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
peternolan9
Participant
Posts: 214
Joined: Mon Feb 23, 2004 2:10 am
Location: Dublin, Ireland
Contact:

Post by peternolan9 »

Hi Ray,
sorry, you are correct... I meant that only hash files under 999MB can be loaded into memory... in the testing we did we found that above that limit it was not worth using hash files... so we made changes to get all our hash files in under the limit... I don't think it's just a case of the spin button not going past 999, I think there is some limit inside it hard-coded to disallow larger files from being loaded into memory...

Best Regards

Peter
ray.wurlod wrote:Hashed files do NOT have a hard-coded limit of 999MB, and never have.

This limit occurs only in the GUI, where "someone" neglected to allow more than three digits in the SpinButton control.

By default, hashed files have a limit of 2GB, caused by internal pointers being four bytes (32 bits). By explicitly specifying eight byte (64 bit) pointers (when the hashed file is created or subsequently resized), hashed files have a theoretical upper limit of around 19 million TB.

I still believe your error is caused by your mis-spelling the name of the function, and nothing at all to do with hashed files.
Best Regards
Peter Nolan
www.peternolan.com
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

ray.wurlod wrote:I still believe your error is caused by your mis-spelling the name of the function, and nothing at all to do with hashed files.
As do we all. Peter's just trying to get a clarification on the "I am only trying to load 2000000 million records into the Hash File" comment, which had me wondering as well. Seems like a pretty big 'only' to me. :wink:
-craig

"You can never have too many knives" -- Logan Nine Fingers
MaheshKumar Sugunaraj
Participant
Posts: 84
Joined: Thu Dec 04, 2003 9:55 pm

Post by MaheshKumar Sugunaraj »

Hi All,

Thanks for your valuable suggestions. I have opted to use a Sequential File stage for the output instead of a Hash file.

This is because most of the lookups use other tables with smaller amounts of data. Thanks, Ray.

I also apologize for marking the post as URGENT.

Thanks
SMK