general validations

A forum for discussing DataStage<sup>®</sup> basics. If you're not sure where your question goes, start here.

Moderators: chulett, rschirm, roy

Post Reply
Cr.Cezon
Participant
Posts: 101
Joined: Mon Mar 05, 2007 4:59 am
Location: Madrid

Post by Cr.Cezon »

You can use a BuildOp stage.
Sudhindra_ps
Participant
Posts: 45
Joined: Thu Aug 31, 2006 3:13 am
Location: Bangalore

Post by Sudhindra_ps »

Hi,

You could achieve this by designing a shared container, but it all depends on the metadata you are processing on a job-by-job basis, as well as your business requirements and job design. The types of validations you mentioned in your questionnaire can be handled quite easily using a Transformer stage.

Thanks & regards
Sudhindra P S
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

1) Individually per field. RI checks using a lookup against (an image of) the target parent table.

2) Not null. Possibly replacing null with default value ("in-band null").
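To make the two checks concrete, here is a minimal sketch in Python (purely illustrative; in DataStage itself you would use a Lookup stage against an image of the parent table and a Transformer for the null handling). The column names, default value, and row layout are all assumptions, not anything from a real job.

```python
# Hypothetical sketch of the two checks above, outside DataStage.

DEFAULT_QTY = 0  # assumed in-band default used to replace nulls

def validate_row(row, parent_keys, rejects):
    """Apply an RI check and a not-null check to one row."""
    # 1) Referential-integrity check: the foreign key must exist in
    #    (an image of) the target parent table, here just a set of keys.
    if row["customer_id"] not in parent_keys:
        rejects.append(row)
        return None
    # 2) Not-null check: replace a null with an in-band default value.
    if row.get("quantity") is None:
        row["quantity"] = DEFAULT_QTY
    return row

parent_keys = {"C001", "C002"}  # image of the parent table's key column
rejects = []
rows = [
    {"customer_id": "C001", "quantity": None},
    {"customer_id": "C999", "quantity": 5},
]
valid = [r for r in (validate_row(r, parent_keys, rejects) for r in rows) if r]
print(len(valid), len(rejects))  # → 1 1
```

The reject list plays the role of a reject link; the returned rows are what would flow down the stream link.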
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
prasannak
Premium Member
Posts: 56
Joined: Thu Mar 20, 2008 9:45 pm
Contact:

Post by prasannak »

Hi,

Thanks for all the suggestions...
I was thinking more in terms of having a parameter table with these columns and somehow determining a grouping criterion so that the columns can be queried from an Oracle stage, with a Transformer used to perform the validations. The validation rules would be defined in the parameter table per column.
The job thus created could be made sharable, so that anybody needing certain column validations would just include this job in their design... basically, a plug-and-play component.
This way, I could ensure a generic, parameter-driven solution.
If an additional column needs to be validated in the future, all I would have to do is add an entry to the table and modify this common job.
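To illustrate the idea, here is a small Python sketch of a rule table driving a generic validation routine (everything here is a made-up example: the rule names, columns, and table layout are assumptions, not an actual DataStage design).

```python
# Hypothetical parameter-table-driven validation: rules are data rows
# (column name, rule, rule argument), applied by one generic routine.

# The rule table, as it might be selected from an Oracle parameter table.
RULE_TABLE = [
    {"column": "age",   "rule": "range",    "arg": (0, 120)},
    {"column": "email", "rule": "not_null", "arg": None},
]

def check(value, rule, arg):
    """Evaluate a single rule against a single value."""
    if rule == "not_null":
        return value is not None
    if rule == "range":
        lo, hi = arg
        return value is not None and lo <= value <= hi
    raise ValueError(f"unknown rule: {rule}")

def validate(row):
    """Return the list of failed (column, rule) pairs for one row."""
    return [(r["column"], r["rule"])
            for r in RULE_TABLE
            if not check(row.get(r["column"]), r["rule"], r["arg"])]

print(validate({"age": 200, "email": None}))
# → [('age', 'range'), ('email', 'not_null')]
```

Adding a new column to validate then means inserting a row into RULE_TABLE rather than changing the job logic, which is the plug-and-play property described above.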

Does this make sense in terms of a modular, parameter-driven approach?
Any suggestions would be greatly appreciated...
Being a DS newbie, I might be way off in assessing DS capabilities here... so, correct me if my assumptions are wrong. It would just be a learning opportunity for me...


Ray,
I did not understand your first reply about the image...
Also, how far can one go with table-driven parameter solutions in DataStage, or are there better approaches...?
Our architecture calls for creating and maintaining many different parameter tables (for lookups, transform rules, error codes, job status, etc.), amounting to about 15 tables... which makes me wonder whether we are over-parameterizing.
Is this how it is generally done in DS for a big project, from a manageability standpoint?


Thx
Post Reply