Need advice on DataStage QualityStage interaction.

vmcburney
Participant
Posts: 3593
Joined: Thu Jan 23, 2003 5:25 pm
Location: Australia, Melbourne
Contact:

Need advice on DataStage QualityStage interaction.

Post by vmcburney »

I'm using the QualityStage plugin for DataStage. I need to take a flat file with about 100 fields and no primary key, clean six of the fields in QualityStage, then write the result out to a new flat file with the same 100 columns. What is the best way to do this? If I pass just the six columns to QualityStage, how do I merge them back into the 100-column format? If I pass all 100 columns to QualityStage, doesn't this slow the job down and create a metadata management nightmare? While DataStage can import table definitions in a matter of seconds, QualityStage users need to do it manually and are restricted to short column names.

There are several different file formats but the objective is the same, to clean the same six fields from each file.
vmcburney
Participant
Posts: 3593
Joined: Thu Jan 23, 2003 5:25 pm
Location: Australia, Melbourne
Contact:

Post by vmcburney »

Well, we came up with a design, for anyone interested.
Allocate each row a unique identifier. Pass the six fields and the unique identifier to a QualityStage stage and write the cleaned values to a hash file. Then read the original file in again and replace the old fields with the cleaned fields from the hash file, using the unique identifier as the key.
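
For anyone who wants to see the shape of this outside DataStage, here is a rough Python sketch of the same key-and-merge pattern. The file names, column names and the clean_fields function are made up for illustration; in the real job the cleansing is done by the QualityStage stage and the merge by a hash file lookup in a transformer.

Code:
import csv

SOURCE = "customers.csv"        # hypothetical 100-column source file
OUTPUT = "customers_clean.csv"
SIX_FIELDS = ["addr1", "addr2", "suburb", "state", "postcode", "country"]  # illustrative names

def clean_fields(row):
    # Stand-in for the QualityStage cleansing; here we just trim and upper-case.
    return {f: row[f].strip().upper() for f in SIX_FIELDS}

# Pass 1: allocate a unique id per row and clean only the six fields,
# keeping the results keyed by that id (the hash file step in the job).
cleaned = {}
with open(SOURCE, newline="") as src:
    for row_id, row in enumerate(csv.DictReader(src)):
        cleaned[row_id] = clean_fields(row)

# Pass 2: re-read the original file and substitute the cleaned values
# back into the full 100-column record, using the id as the lookup key.
with open(SOURCE, newline="") as src, open(OUTPUT, "w", newline="") as out:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(out, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row_id, row in enumerate(reader):
        row.update(cleaned[row_id])
        writer.writerow(row)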
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

This isn't a bad idea; it's part of "best practice" for the INTEGRITY product to maintain a cross-reference file that maps a (generated) unique key to the rows in the source data.
vdreddy
Participant
Posts: 5
Joined: Fri Oct 10, 2003 11:32 am

Interaction between DS (PX) 7.x and QS?

Post by vdreddy »

Can you advise on how you are handling the interaction between the DS and QS stages?
1. Are you using QS as a stage in DS?
2. Is there a separate plugin for using QS in DS?

Your feedback is appreciated.
vmcburney
Participant
Posts: 3593
Joined: Thu Jan 23, 2003 5:25 pm
Location: Australia, Melbourne
Contact:

Post by vmcburney »

We are using the QualityStage plugin for DataStage. DataStage opens a sequential file containing addresses and passes the address fields to QualityStage, which cleans them, maps them to the Australian AMAS address standards and passes the standardised addresses back; DataStage then writes them out to a sequential file.

The controlling sequence job then calls an additional QualityStage job by shelling out to a Unix script and executing it from the command line. That job produces match reports on the address file created by the embedded QualityStage job. It is for reporting purposes and was easier than embedding it in DataStage.
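
The command-line step is really just a script invocation followed by a return-code check. Here is a rough Python sketch of that pattern; the script path, file path and argument are made up, and in the actual sequence the call is simply the command the sequence job executes.

Code:
import subprocess
import sys

# Hypothetical wrapper script that runs the standalone QualityStage match job
# against the address file written by the embedded QualityStage step.
MATCH_SCRIPT = "/opt/qs/scripts/run_address_match.sh"
ADDRESS_FILE = "/data/out/clean_addresses.txt"

result = subprocess.run([MATCH_SCRIPT, ADDRESS_FILE], capture_output=True, text=True)

# Fail the calling step if the match reports could not be produced.
if result.returncode != 0:
    sys.stderr.write(result.stderr)
    sys.exit(result.returncode)

print("Match reports generated for", ADDRESS_FILE)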

In transactional mode, the QualityStage plugin operates just like a transformer, with rows passing through it.
vdr123
Participant
Posts: 65
Joined: Fri Nov 14, 2003 9:23 am

Post by vdr123 »

Thanks for the feedback...

Do you know if we can use the QS plugin for PX jobs? When I tried to add the QS plugin to the PX job palette, it shows up on the palette but it is disabled.

When I open a server job, I can see it enabled there.

Is the plugin only for server DS? Is the plugin not compatible with PX jobs?