writing to database tables using routines
Moderators: chulett, rschirm, roy
-
- Premium Member
- Posts: 133
- Joined: Tue Nov 23, 2004 11:24 pm
- Location: India
Hi all,
Problem:
I have some data in a sequential file and I want to write it into an ODBC table using a BASIC routine (there are lots of files and lots of tables; I didn't want to make my job look complex, so I'm trying to achieve this with a routine).
I succeeded in opening the file and reading the data, but I'm facing issues writing it to the ODBC table. A simple INSERT statement run through DSExecute might have done the trick, but the catch is that my Oracle instance is on a different server.
So I guess I first need to open a connection to that server, find my specified table, and insert the data accordingly.
I searched all around the forum archives to no avail, and found nothing in the documentation either. There was something written about the DS_CONNECT command, but I couldn't get any information about that.
It would be really helpful if someone can help me with this requirement.
Thanks and Regards
Pavan
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
Beware that your ODBC connection will only work for 30 days using BCI functions; after that you will need to purchase a licence for the ODBC driver from DataDirect Technologies.
You can find the BCI manual here
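For orientation, the BCI call sequence from a routine looks roughly like the sketch below. This is only a hedged outline, not working code: the DSN name "ORA_DSN", the credentials, the table and column names are all placeholders, and the exact include file and function signatures should be confirmed against the BCI manual.

```
* Hedged sketch of a BCI INSERT from a DataStage BASIC routine.
* All names (DSN, user, password, table) are placeholders.
$INCLUDE UNIVERSE.INCLUDE ODBC.H

Status = SQLAllocEnv(hEnv)
Status = SQLAllocConnect(hEnv, hConn)
Status = SQLConnect(hConn, "ORA_DSN", "username", "password")
If Status = SQL.SUCCESS Then
   Status = SQLAllocStmt(hConn, hStmt)
   SQLText = "INSERT INTO MY_TABLE (COL1, COL2) VALUES ('abc', 123)"
   Status = SQLExecDirect(hStmt, SQLText)
   If Status # SQL.SUCCESS Then
      Call DSLogWarn("Insert failed, status ":Status, "MyRoutine")
   End
   Status = SQLFreeStmt(hStmt, SQL.DROP)
   Status = SQLDisconnect(hConn)
End
Status = SQLFreeConnect(hConn)
Status = SQLFreeEnv(hEnv)
```

Remember the licensing caveat above: this call sequence will stop connecting after the 30-day trial unless the driver is licensed.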
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
-
- Premium Member
- Posts: 133
- Joined: Tue Nov 23, 2004 11:24 pm
- Location: India
Hi Ray,
Thanks for the PDF. Can you please elaborate on "your ODBC connection will only work for 30 days using BCI functions, then you will need to purchase a licence for the ODBC driver from DataDirect Technologies"?
Can you suggest any other solution for my problem? I mean a simpler way of achieving the objective; I don't want to make my job look clumsy with lots of Sequential File and ODBC stages.
Thanks and Regards
Pavan
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
The ODBC drivers that ship with DataStage are licensed only for use from DataStage stages; they are not licensed for use from code. However, they do work for 30 days (a "trial period"?), after which attempts to connect using them fail, generating a message indicating the need to purchase a licence.
With Wolfgang, I cannot see why you can't use DataStage jobs to move the data to and from databases. You can, of course, execute DataStage jobs from routines; I do this a lot, often only to move a single row. Routines can easily access that single row if it's in a hashed file or a text file, with no licensing problems.
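Running a job from a routine uses the DataStage BASIC job-control API. A hedged sketch follows; the job name "LoadOneTable" and its parameter names are invented for illustration, and you would substitute your own:

```
* Hedged sketch: run a small load job from a routine via the
* DataStage job-control API. Job and parameter names are placeholders.
hJob = DSAttachJob("LoadOneTable", DSJ.ERRFATAL)
ErrCode = DSSetParam(hJob, "SourceFile", "/data/in/file1.txt")
ErrCode = DSSetParam(hJob, "TargetTable", "MY_TABLE")
ErrCode = DSRunJob(hJob, DSJ.RUNNORMAL)
ErrCode = DSWaitForJob(hJob)
Status = DSGetJobInfo(hJob, DSJ.JOBSTATUS)
If Status = DSJS.RUNFAILED Then
   Call DSLogWarn("Load job failed", "MyRoutine")
End
ErrCode = DSDetachJob(hJob)
```

Because the job itself uses an ODBC stage, the driver licence question does not arise.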
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
-
- Premium Member
- Posts: 133
- Joined: Tue Nov 23, 2004 11:24 pm
- Location: India
Hi Ray and Wolfgang,
I guess I did not make myself clear with the problem, so I'll try to explain it in a different way.
I have 20 sequential files with different schemas. I created 20 tables in my database, as my model is built around a database approach. Each country where we deploy will give us the sequential files; we have to load that data into the tables alongside the existing information of the already-migrated countries and use the whole data set.
To achieve this, the different ways I could think of are:
1st way:
20 different jobs, each of which looks like this:
seqfile ----> ODBC Table
2nd way:
1 single job which looks like this:
seqfile1 ----> ODBC Table
seqfile2 ----> ODBC Table
.
.
.
seqfile20 ----> ODBC Table
The design also looks a bit odd in this approach.
3rd way:
One routine which opens sequential file 1, connects to the respective ODBC table and fills in the data; I will write this routine to do the same thing for the rest of the 19 sequential files.
The first two ways, using the traditional DataStage stages, make the jobs look too clumsy, so I thought the third one was better. I have actually finished writing the routine for the third way, which I will post for your feedback in a day or two.
If I have made myself clear, please suggest any other way of doing it.
Ray, can you also please tell me where I ought to buy the licence and tentatively how much it will cost us, so that we can at least make a proposal to the client? I'm sure the client is going to make us stick to the 1st or 2nd approach :D. I just wanted to know how much the licence costs and where to get it from.
Waiting for your invaluable suggestions.
Thanks and Regards
Pavan
-
- Participant
- Posts: 3593
- Joined: Thu Jan 23, 2003 5:25 pm
- Location: Australia, Melbourne
- Contact:
You are effectively bypassing DataStage and forcing the data straight into the tables without much care for transformation or metadata management. The correct design in DataStage is to have a job per table definition, so 20 jobs instead of 1 routine. Since you are not doing any complex transformations, building these jobs will be very fast, faster than trying to build your own ETL engine in a BASIC routine.
If you try to get the single-routine-for-all-table-definitions approach working, you are going to have to maintain the file descriptions yourself, and you will hit data type mismatch problems and unexpected data content without the benefits DataStage gives you of warnings, rejects and array sizing. You lose the benefits of importing sequential file and table definitions and using them in jobs.
Last edited by vmcburney on Thu Jan 06, 2005 10:23 pm, edited 1 time in total.
Certus Solutions
Blog: Tooling Around in the InfoSphere
Twitter: @vmcburney
LinkedIn:Vincent McBurney LinkedIn
You need to state your database. If you're using Oracle, then you don't even need to use DataStage. You need to learn sqlldr. It has every piece of functionality you have described, plus it uses native connectivity and has the highest performing loading rate for Oracle.
You can use sqlldr to create tables, you can dynamically write the mapping ("a .ctl file") document that describes which source columns map to target. You can derive columns, do math, etc. You have restart capability, plus you have techniques for inserting and updating (merge).
You need to forget that you have DataStage. You are trying to use a Space Shuttle as a submarine. You need the right tool for the job, and DATASTAGE IS NOT IT. It is an ETL tool, with larger considerations (metadata, graphical paradigm, etc). Your considerations are too far from what an ETL tool does. This is not criticism of you, it is a peer-to-peer recommendation to use a more appropriate tool.
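To make the sqlldr suggestion concrete, here is a hedged sketch of a control (.ctl) file for one of the 20 files; the table name, column names, delimiter and date format are all invented placeholders that you would replace with your own schema:

```
-- Hedged sketch of a SQL*Loader control file for one source file.
-- Table, columns, delimiter and date mask are placeholders.
LOAD DATA
INFILE '/data/in/file1.txt'
APPEND
INTO TABLE country_sales
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(country_code,
 sale_date DATE 'YYYY-MM-DD',
 amount)
```

You would then invoke it along the lines of `sqlldr userid=user/pass@tnsname control=file1.ctl log=file1.log bad=file1.bad`, generating one control file per source file (or writing them dynamically, as Ken describes).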
Kenneth Bland
Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
The vendor is MainSoft (formerly known as DataDirect). I do not have current pricing information, but I can tell you that it is priced per CPU in your machine; expect tens of thousands of US dollars.
Search the Forum for examples of code using BCI functions - there are quite a few that have been posted over the years.
All that said, I'd still advocate calling a small, separate DataStage job from your routine, and having that job move the text to the table. (Plus there are no licensing issues!)
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
-
- Participant
- Posts: 3593
- Joined: Thu Jan 23, 2003 5:25 pm
- Location: Australia, Melbourne
- Contact:
Exactly. Your 3rd way would be easier to do in Visual Basic, as it is a more user-friendly programming tool. I would estimate that the 1st way would take about 2 days; the 2nd way 3-5 days, because you could end up debugging performance and memory problems; and the 3rd way 10 days, because you are manually entering your metadata and coding your own error handling and data type conversions.
Certus Solutions
Blog: Tooling Around in the InfoSphere
Twitter: @vmcburney
LinkedIn:Vincent McBurney LinkedIn
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact: