Need help with universe metadata!!!
Hi,
I want to read the DataStage repository tables. To do so I would need the table definitions of the UniVerse database tables. How can I import the metadata of all the tables in the UniVerse repository of DataStage?
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
That information is the proprietary intellectual property of Ascential Software and is not in the public domain.
The repository tables are, in the main, deliberately published without column definitions; they are intended to be accessed only programmatically.
Where Ascential believes it is appropriate for you to know them, the column definitions are published. You can access them from the command line using the LIST.DICT command.
However, when importing UniVerse (hashed) file definitions, the Repository tables' names are deliberately suppressed from the list of available tables. It is not possible to override this "feature".
With a little UniVerse knowledge you can expose them through parallel table definitions sharing the same file dictionaries. But you will never get the complete definitions of all columns. Not least because there are multiple record formats in most of the repository tables; it is definitely not a first normal form environment!
What are you trying to accomplish? That is, why do you believe you need these table definitions? There may be alternatives, such as extracting data from a MetaStage hub, or from the documentation tool database (which IS first normal form).
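To make the "little UniVerse knowledge" approach concrete, here is a sketch of what it can look like at the UniVerse TCL prompt. MY_JOBS is a hypothetical name, and the DS_JOBS / D_DS_JOBS file names vary by release, so treat this as an illustration rather than a recipe:

      >LIST.DICT DS_JOBS

      >ED VOC MY_JOBS
      001: F
      002: DS_JOBS
      003: D_DS_JOBS

The F-type VOC entry creates a file pointer whose data and dictionary portions are shared with DS_JOBS; because it has a different name, it is not caught by the suppression of repository table names in the import list.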
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
And keep in mind that, for any given object type (job, routine, transformer, table definition) in a given release, the same fields/columns are used for different things (overloaded), so reverse engineering could be painful and very inaccurate.
Also, some fields are bitmaps and unless you understand how to decipher the bitmap this information would be meaningless when viewed in the editor or DSX.
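As an illustration (the flag positions here are hypothetical; the actual bit meanings are undocumented), deciphering such a field means testing individual bits, e.g. with BITAND in DataStage BASIC:

      Flags = 5                        ;* example bitmap value from a repository field
      FlagA = BITAND(Flags, 1)         ;* bit 0 set -> 1
      FlagB = BITAND(Flags, 4)         ;* bit 2 set -> 4

A raw value like 5 only becomes meaningful once the purpose of each bit is known.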
Regards,
Mike Hester
mhester@petra-ps.com
-
- Participant
- Posts: 214
- Joined: Mon Feb 23, 2004 2:10 am
- Location: Dublin, Ireland
- Contact:
Hi dwscblr,
Ascential have a product for this: it's called MetaStage. It migrates all DataStage metadata into a relational database, along with a lot of other things.
If you are brilliant, like Tom Nell, you might even be able to understand the MetaStage data model and get what you need out of it. Or you can just contact Tom to write the queries you want.
I would not expect any reputable consultant to help you steal Ascential IP in lieu of buying the product they have built to do what you want to do, which is what you are asking for.
dwscblr wrote: I want to replicate the UniVerse repository into another relational DB, which can be anything: Oracle, SQL Server etc. Thus, to be able to do that, I need to read the UniVerse repository tables. Is there any other way in which I can do that?
The Documentation Tool (from Manager) will do this for you.
C:\Program Files\Ascential\DataStage\Documentation Tool contains scripts for setting up the required tables in Oracle, SQL Server, etc., called oracle.sql, mssql.sql, etc.
You also set up an ODBC DSN to point to your database.
When using the Documentation Tool, go to the Options tab and change the DSN from DSReporting (which points to doc_tool.mdb) to your DSN, and away you go!
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
-
- Participant
- Posts: 3593
- Joined: Thu Jan 23, 2003 5:25 pm
- Location: Australia, Melbourne
- Contact:
You can also twiddle your thumbs for twelve months, because by the end of 2005 the Ascential suite will be on a database-independent repository with support for SQL Server, Oracle or DB2. I think Ascential have been taking flak for years from Informatica in vendor comparisons, because Informatica has an open repository and DataStage doesn't.
Certus Solutions
Blog: Tooling Around in the InfoSphere
Twitter: @vmcburney
LinkedIn:Vincent McBurney LinkedIn
All "they" said at Ascential World was that it will be an "open" architecture, but they gave nothing more than that - no details.
Mike Hester
mhester@petra-ps.com
They said that out of the box you could install it on MySQL, Oracle, SQL Server or DB2, and that Sybase was negotiating to be added to that list. For other databases you would probably have to create your own data structure. MySQL would be free; the others would require a licensed RDBMS. Now, unless you know how to hide the metadata on these database types, you've got to assume the entire table and column structure will be exposed. These table relationships can be very complex; just try reporting on derivations in the DataStage reporting tool.
Certus Solutions
Blog: Tooling Around in the InfoSphere
Twitter: @vmcburney
LinkedIn:Vincent McBurney LinkedIn
Hi,
You can create a routine to read all the column names, lengths and scales off the table definitions.
Try this routine, it may help.
From the discussion I'm not clear what exactly you want, because after my first reply yesterday the discussion has become very long.
Anyway, try this:
      Opened = @TRUE
      OPEN "DS_METADATA" TO MetaDataFile ELSE
         Call DSLogWarn("Error opening Metadata Library file - DS_METADATA", "GetTableDef")
         Opened = @FALSE
      END
      IF Opened THEN
         READ SrcRec FROM MetaDataFile, CategoryName THEN
            ;* Field 12 holds the column names; fields 15-19 hold the
            ;* type code, length, scale, nullability and key flag.
            SrcColCnt = DCOUNT(SrcRec<12>, @VM)
            ColumnFound = 0
            FOR i = 1 TO SrcColCnt
               MColName = SrcRec<12,i>
               MTypeCode = SrcRec<15,i>
               MLength = SrcRec<16,i>
               MScale = SrcRec<17,i>
               MNullable = SrcRec<18,i>
               MKey = SrcRec<19,i>
            NEXT i
         END ELSE
            Call DSLogWarn("Record not found: " : CategoryName, "GetTableDef")
         END
      END
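For context (the values below are illustrative only; real repository records are far messier), a routine like this relies on repository records being dynamic arrays: field 12 holds all the column names separated by value marks, so SrcRec<12,i> extracts the i-th name. A minimal DataStage BASIC sketch:

      SrcRec = ""
      SrcRec<12> = "EMPNO" : @VM : "ENAME" : @VM : "SAL"
      CRT SrcRec<12,2>                ;* displays ENAME
      CRT DCOUNT(SrcRec<12>, @VM)     ;* displays 3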