error writing into hash file..

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

scottr
Participant
Posts: 51
Joined: Thu Dec 02, 2004 11:20 am

error writing into hash file..

Post by scottr »

I know there are similar posts, but please have a look at this one.
These are 50 warnings:
PT_RPOS_Lookup.LdLookup: ds_uvput() - Write failed for record id '92612921'
PT_RPOS_Lookup.LdLookup: ds_uvput() - Write failed for record id '92453581'

and below is the message that appears after "Attempting to Cleanup after ABORT raised in stage":

DataStage Job 482 Phantom 11440
Program "DSD.StageRun": Line 571, Internal data error.
Program "DSD.StageRun": Line 571, Internal data error.
PT_RPOS_Lookup2003andup/DATA.30':
Computed blink of 0x830 does not match expected blink of 0x0!
Detected within group starting at address 0x90603000!
PT_RPOS_Lookup2003andup/DATA.30':
Computed blink of 0x830 does not match expected blink of 0x0!
Detected within group starting at address 0x89C27800!

Does anyone have any advice on what has happened here?
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL

Post by kcbland »

You've corrupted the hash file, probably because you exceeded 2.2 gigabytes on a 32BIT file. You could also have run out of disk space.
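
A quick way to confirm either cause from the engine host is to look at the file's segments and the free space directly (a sketch, assuming a standard dynamic hashed file stored as a DATA.30/OVER.30 pair; substitute the actual directory of your hashed file):

ls -l PT_RPOS_Lookup2003andup/DATA.30 PT_RPOS_Lookup2003andup/OVER.30
df -k .

If DATA.30 is anywhere near 2 GB, the 32BIT addressing limit is the problem rather than free disk space.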
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
scottr
Participant
Posts: 51
Joined: Thu Dec 02, 2004 11:20 am

Post by scottr »

I think you're correct. It has 2434850816 bytes in DATA.30 and 586504192 in OVER.30.
How do I correct it? Do I have to delete the old one and create a new one, or is there a way to increase the size of the old hash file?
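
(For scale, taking those figures as byte counts: 2434850816 bytes is about 2.27 GB, already past the 2^31-byte mark of roughly 2.1 GB that 32BIT file addressing can reach, so the "computed blink" corruption above is consistent with that diagnosis.)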

thanks
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL

Post by kcbland »

First of all, are you attempting to put 500 million rows into a hash file? If so, you had better contemplate a better solution. Just figure out how much raw character storage is required: 500 million rows at even 20 characters per row already puts you at roughly 10 GB of raw data.

Hash files can hold that, but is it really the solution you're looking for? Based on your other post, I believe you are working on your data correction problem.

To directly answer your question, search the forum for "64BIT" hash files. That should get you past the 32BIT limitation. But you'll be loading that hash file for days or weeks.
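
For reference, the conversion itself is done from the engine shell (TCL) once you have an intact copy of the file to work with; a minimal sketch, assuming an account-level dynamic hashed file named as in the log above (swap in your own project and file names):

LOGTO YourProject
RESIZE PT_RPOS_Lookup2003andup * * * 64BIT

LOGTO attaches to the project that owns the file, and the three asterisks tell RESIZE to keep the existing type, modulus and separation while converting the addressing to 64BIT. Since this particular file is already corrupted, you would normally delete it, let the job recreate and reload it, and convert it before it grows large again.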
Kenneth Bland

scottr
Participant
Posts: 51
Joined: Thu Dec 02, 2004 11:20 am

Post by scottr »

Hi,
they are two different problems. This one is a daily job that loads around 150K records into the hash file, and it has been running for a couple of months. Now it is oversized and corrupted. So is switching to 64BIT the only option?
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL

Post by kcbland »

You are attempting to fit 11 pounds of apples into a 10 pound apple box. Either remove unnecessary columns from the hash file to reduce the amount of data going into the file, or switch to creating it as 64BIT. There is a hard ceiling on 32BIT files; there's no way to squeeze more into that container. 2.2 gigabytes is it.
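
If you do rebuild it, the file can also be made 64BIT from the start at TCL; a rough sketch, using the name from the log and assuming your engine release accepts the 64BIT keyword on CREATE.FILE (worth verifying on your version; if not, create the file normally and RESIZE it to 64BIT as above before the first load):

CREATE.FILE PT_RPOS_Lookup2003andup 30 64BIT

Type 30 is the dynamic file type the Hashed File stage creates by default, so the job can then load it as usual.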
Kenneth Bland

scottr
Participant
Posts: 51
Joined: Thu Dec 02, 2004 11:20 am

Post by scottr »

Thanks a lot, Kenneth Bland.
T42
Participant
Posts: 499
Joined: Thu Nov 11, 2004 6:45 pm

Post by T42 »

Also keep in mind: HASH FILES ARE NOT A PERMANENT STORAGE SOLUTION! The same goes for Datasets in EE. They are meant to be temporary storage for efficient processing downstream.