OCI stage as hash file in server job

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

spallam
Participant
Posts: 7
Joined: Thu Mar 29, 2007 5:48 am
Location: Nellore

OCI stage as hash file in server job

Post by spallam »

Hi all,
How can I implement the design described below in a PX job using OCI, and which stages should I use?

In a server job, a hashed file created from the table AMALGT is used both as a lookup and as one of the targets, into which newly found data from a particular field is inserted (i.e. a dynamic hashed file with the loading properties enabled),
so that a new record inserted into the hashed file at the target is reflected immediately in the lookup during the same run and is used by the subsequent source records.

Please suggest how to implement the same server job model in a PX job, and which stages to use, keeping in mind that AMALGT is an OCI table.
Also, what changes need to be made for partitioning / node configuration?



Thanks & Regards
Spallam
Sai
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

You don't have access to hashed files in PX/EE, so you would need to do something such as write to a database table with no buffering and an immediate commit, and then use a sparse lookup against that table elsewhere in the job.
Performance will be miserable, so it might make sense to revisit your job design, perhaps doing the work in two passes.
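
As an illustration only, a sparse lookup from an Oracle Enterprise stage would use user-defined SQL along these lines, with ORCHESTRATE.<column> standing for the key value arriving on the lookup link; the column names below are just placeholders, not anything from your actual AMALGT table:

Code:

    -- Placeholder names: substitute your real AMALGT key and data columns
    SELECT KEY_COL, DATA_COL
      FROM AMALGT
     WHERE KEY_COL = ORCHESTRATE.KEY_COL

On the insert link you would keep the array size and commit interval down at 1 so that each new row is committed straight away and becomes visible to the sparse lookup, which is a large part of why the performance will be so poor.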