
Posted: Thu Nov 20, 2008 11:48 pm
by basu.ds
use schema files and parameterised the job

Posted: Fri Nov 21, 2008 12:22 am
by jaysheel
Thanks for the reply.

Could you please elaborate on that?

Thanks

Posted: Fri Nov 21, 2008 12:57 am
by jaysheel
Can anyone suggest something on this?
Ray, expecting a reply from you :)

Posted: Fri Nov 21, 2008 3:17 am
by ray.wurlod
Ten separate jobs, each using correct metadata, would be my preferred approach - easier to maintain when things change.

Posted: Fri Nov 21, 2008 4:00 am
by jaysheel
That's right, Ray. But here the challenge is to reuse one job for 10 different tables. We are researching this. If reusability is possible, could you tell me how to approach it?


Thanks,

Posted: Fri Nov 21, 2008 5:28 am
by mdbatra
I think basu.ds has already provided you the solution:
use a schema file to pass the metadata, and give parameters for the input file, target table and schema file!
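
For illustration only (the column names here are invented, not taken from this thread), a schema file for one of the 10 tables could look something like this in the record syntax the parallel engine reads:

  record
  (
    CUST_ID: int32;
    CUST_NAME: nullable string[max=100];
    CREATED_DT: nullable date;
  )

The job would then carry parameters along the lines of pSourceFile, pSchemaFile and pTargetTable (example names only), and the Sequential File stage would reference them as #pSourceFile# and #pSchemaFile#.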

Posted: Fri Nov 21, 2008 6:31 am
by Karthi_sk
Hi,
I am also having a similar issue.
I tried using the schema file as a parameter for the Sequential File stage (the source) and it works perfectly.
The problem starts when I try to parameterise the target table, as I don't see any option to use a schema file in any of the DBMS stages.
That means I am able to parameterise the source file name, source file definition and the target table name, but not the target table definition, and under these conditions the metadata for the source and target do not match.
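
To make the working part concrete (the parameter names below are only placeholders, not from my actual job), the source side is roughly:

  Sequential File stage (source)
    File        = #pSourceFile#
    Schema File = #pSchemaFile#

It is only on the target database stage that I find nothing equivalent to the Schema File property.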

Posted: Fri Nov 21, 2008 7:26 am
by Mike
Ray's preferred approach would also be my preferred approach.

Maintainability is an important factor. Having the metadata for impact analysis and data lineage is extremely important.

It's true that Ab Initio can do this easily with a parameterized graph, but only if the Ab Initio developer takes the time to make sure that data lineage and impact analysis are not broken. In practice they generally don't take care of the metadata, which is one reason (in my opinion) that Ab Initio's metadata solution is so weak.

Mike

Posted: Sat Nov 22, 2008 5:39 am
by mdbatra
Karthi,
when we have RCP enabled and provide the parameters for the input data file, schema file and target table name, there is no need to worry about the target table definition. It will be populated at run time.
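
A rough sketch of the settings I mean (property wording may differ slightly between stage types, and the parameter names are only examples):

  Project/job      : Runtime Column Propagation enabled
  Sequential File  : File = #pSourceFile#, Schema File = #pSchemaFile#
  Output link      : RCP ticked, no columns defined at design time
  Target DB stage  : Table = #pTargetTable#

With that in place, the columns from the schema file are picked up at run time and propagated down the link, so the database stage builds its write against #pTargetTable# without a design-time table definition.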

Posted: Sat Nov 22, 2008 7:55 am
by Mike
Does anyone know what MDB's suggestion does to impact analysis and data lineage? Is it broken? Can it be easily fixed? If it is broken and takes anything more than trivial developer effort to fix, then I personally will avoid this technique. I place much greater value on metadata and maintainability than I do on a minor productivity gain.

Mike

Posted: Sat Nov 22, 2008 8:14 am
by mdbatra
Mike..
With all due respect, I totally agree with building 10 different jobs for the sake of impact analysis and data lineage.
But perhaps my post was misread. It was just to let Karthi, who was trying to achieve reusability in a single job, know how to do that irrespective of the maintenance factor.

And Karthi, I trust you now have what you need :D

Posted: Sat Nov 22, 2008 8:43 am
by Mike
MDB,

To the contrary, I really appreciate the suggestion that you offered. For my own education, I just want to understand the pros and cons of different design alternatives. I don't have a good feel for the current metadata capabilities in Information Server and haven't done anything with metadata since my return to the DataStage world. I suspect the "utility" approach would break data lineage and impact analysis in IIS as well... just want to confirm that suspicion and see if it is "repairable".

Mike

Posted: Sat Nov 22, 2008 9:15 am
by mdbatra
There is no denying that the "utility" approach will surely hurt the maintainability of the metadata. Also, despite my limited time with DataStage (2 years), I have never seen a practice that gave priority to the former; it has always been the latter that won out.

Posted: Sat Nov 22, 2008 9:20 am
by mdbatra
Also, regarding repairability: in our DW/BI arena we do not mind doing a little more sophisticated work up front, but yes, we do mind reworking, in fact a lot. That's what I have learnt.
People may differ!
