delete data from hash file

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

harryhome
Participant
Posts: 112
Joined: Wed Oct 18, 2006 7:10 am


Post by harryhome »

Hi, we need to delete a few rows from a hash file on Linux.

Please let us know the way.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Terminology: it's hashed file, not hash file.

It is not possible to delete rows from a hashed file from a parallel job.

There is no support for hashed files in parallel jobs (other than by including a server Shared Container).
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
rameshrr3
Premium Member
Posts: 609
Joined: Mon May 10, 2004 3:32 am
Location: BRENTWOOD, TN

Post by rameshrr3 »

Create a pointer to the hashed file in the VOC and use the UniVerse editor or an SQL DELETE statement.
If I knew the record ID, I would use the UniVerse editor (ED).
Also, as Ray notes, a hashed file has no business in a parallel forum.
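As a rough sketch of the VOC-pointer approach above: from the DataStage Engine shell (dssh) in the project directory, SETFILE creates a VOC pointer to an externally pathed hashed file, after which SQL or ED can address it by that name. The path, VOC entry name, and record key below are placeholders, not details from the original post:

```
SETFILE /data/hashed/MyHashedFile MYHASHEDFILE OVERWRITING
DELETE FROM MYHASHEDFILE WHERE @ID = 'SOMEKEY';
ED MYHASHEDFILE SOMEKEY
```

The DELETE removes every record matching the key condition; inside ED, the FD command deletes the currently edited record. Take a backup of the hashed file first, since these operations cannot be undone.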

The simplest approach, however, if you still have the source of this hashed file available, would be to recreate the file (using the 'Clear file before writing' option), perhaps with a one-time job that selects from the source only the records you really need.