writing to database tables using routines

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

Pavan_Yelugula
Premium Member
Posts: 133
Joined: Tue Nov 23, 2004 11:24 pm
Location: India

writing to database tables using routines

Post by Pavan_Yelugula »

Hi all,

Problem:
I have some data in a sequential file and I want to write it into an ODBC table using a BASIC routine (there are lots of files and lots of tables, and I didn't want to make my jobs look complex, so I am trying to achieve this with a routine).

I succeeded in opening the file and reading the data, but I am facing issues writing it to an ODBC table. A simple INSERT statement run through DSExecute might have done the trick, but the catch is that my Oracle instance is on a different server.
So I guess I first need to open a connection to that server, find the specified table and insert the data accordingly.

I searched all around the forum archives to no avail, and I found nothing in the documentation either.
There was something written about the DS_CONNECT command, but I couldn't find any information about that.
It would be really helpful if someone could help me with this requirement.

Thanks and Regards
Pavan
Pavan_Yelugula
Premium Member
Posts: 133
Joined: Tue Nov 23, 2004 11:24 pm
Location: India

Post by Pavan_Yelugula »

Hi,
Can anyone please share the PDF below with me?
UniVerse BASIC SQL Client Interface Guide (BCI.pdf). The archives say I can find what I am looking for in this PDF.

Thanks and Regards
Pavan
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Beware that your ODBC connection will only work for 30 days using BCI functions; after that you will need to purchase a licence for the ODBC driver from DataDirect Technologies.

You can find the BCI manual here
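
For what it's worth, a minimal sketch of the kind of call sequence that manual describes is below. It is only an illustration: the DSN, credentials, table, columns and error codes are made up, and the exact include file and constant names should be checked against the manual for your release.

Code:

      * Minimal BCI sketch only - function names as per the BCI manual.
      * The DSN, user, password, table and column names are made up.
      $INCLUDE UNIVERSE.INCLUDE ODBC.H

      Ans = 0
      Status = SQLAllocEnv(hEnv)
      Status = SQLAllocConnect(hEnv, hConn)

      Status = SQLConnect(hConn, "MY_ORACLE_DSN", "myuser", "mypassword")
      IF Status = SQL.SUCCESS THEN
         Status = SQLAllocStmt(hConn, hStmt)
         SQLText = "INSERT INTO MY_TABLE (COL1, COL2) VALUES ('A', 'B')"
         Status = SQLExecDirect(hStmt, SQLText)
         IF Status <> SQL.SUCCESS THEN Ans = -2   ;* insert failed
         Status = SQLFreeStmt(hStmt, SQL.DROP)
         Status = SQLDisconnect(hConn)
      END ELSE
         Ans = -1   ;* could not connect
      END

      Status = SQLFreeConnect(hConn)
      Status = SQLFreeEnv(hEnv)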
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Pavan_Yelugula
Premium Member
Posts: 133
Joined: Tue Nov 23, 2004 11:24 pm
Location: India

Post by Pavan_Yelugula »

Hi Ray,
Thanks for the PDF. Can you please elaborate on "your ODBC connection will only work for 30 days using BCI functions; after that you will need to purchase a licence for the ODBC driver from DataDirect Technologies"?

Can you suggest any other solution for my problem?

I mean a simpler way of achieving the objective. I don't want to make my job look clumsy with lots of Sequential File and ODBC stages.

Thanks and Regards
Pavan
WoMaWil
Participant
Posts: 482
Joined: Thu Mar 13, 2003 7:17 am
Location: Amsterdam

Post by WoMaWil »

Hi Pavan,

Write a routine that appends your many single sequential files into one sequential file, one file after the other, never several at the same time.

Then use a DataStage job to write from that one file to your table.
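
A rough sketch of such an append routine in DataStage BASIC is below; the argument names (Arg1, Arg2) and error codes are purely illustrative, and the routine assumes a comma-separated list of source paths and one target path.

Code:

      * Sketch only: append a comma-separated list of source files (Arg1)
      * to one target file (Arg2). Error codes are illustrative.
      Ans = 0

      OPENSEQ Arg2 TO TgtPtr ELSE
         * Target does not exist yet - create it
         CREATE TgtPtr ELSE Ans = -1
      END

      IF Ans = 0 THEN
         SEEK TgtPtr, 0, 2 ELSE NULL   ;* position at end of file so we append

         FileCount = DCOUNT(Arg1, ",")
         FOR I = 1 TO FileCount
            SrcFile = FIELD(Arg1, ",", I)
            OPENSEQ SrcFile TO SrcPtr THEN
               LOOP
                  READSEQ Line FROM SrcPtr ELSE EXIT
                  WRITESEQ Line TO TgtPtr ELSE Ans = -2
               REPEAT
               CLOSESEQ SrcPtr
            END ELSE
               Ans = -3   ;* could not open a source file
            END
         NEXT I

         CLOSESEQ TgtPtr
      END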

Wolfgang
Pavan_Yelugula
Premium Member
Posts: 133
Joined: Tue Nov 23, 2004 11:24 pm
Location: India

Post by Pavan_Yelugula »

Hey,
I have lots of tables as well; I am not targeting a single table.

Thanks and Regards
Pavan
WoMaWil
Participant
Posts: 482
Joined: Thu Mar 13, 2003 7:17 am
Location: Amsterdam

Post by WoMaWil »

I don't see what you gain by not using DataStage jobs. Do your tables have different metadata, or are they about the same?

Wolfgang
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

The ODBC drivers that ship with DataStage are licensed only for use from DataStage stages. They are not licensed for use from code. However, they do work for 30 days (a "trial period"?) after which attempts to connect using them fail, generating a message indicating the need to purchase a licence.

Like Wolfgang, I cannot see why you can't use DataStage jobs to move the data to and from the databases. You can, of course, execute DataStage jobs from routines; I do this a lot, often just to move a single row. Routines can easily access that single row if it's in a hashed file or a text file, with no licensing problems.
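
As a rough illustration of that pattern, a routine can drive a small, parameterised load job through the documented job control API. The job name, parameter names and file paths below are hypothetical.

Code:

      * Sketch only: run a small, parameterised load job from a routine.
      * "LoadOneTable", the parameter names and the paths are made up.
      $INCLUDE DSINCLUDE JOBCONTROL.H

      Ans = 0
      hJob = DSAttachJob("LoadOneTable", DSJ.ERRFATAL)

      * Tell the job which file and which table to use on this run
      ErrCode = DSSetParam(hJob, "SourceFile", "/data/in/country01.txt")
      ErrCode = DSSetParam(hJob, "TargetTable", "CUSTOMER")

      ErrCode = DSRunJob(hJob, DSJ.RUNNORMAL)
      ErrCode = DSWaitForJob(hJob)

      JobStatus = DSGetJobInfo(hJob, DSJ.JOBSTATUS)
      IF JobStatus <> DSJS.RUNOK THEN Ans = -1   ;* aborted or finished with warnings

      ErrCode = DSDetachJob(hJob)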
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Pavan_Yelugula
Premium Member
Posts: 133
Joined: Tue Nov 23, 2004 11:24 pm
Location: India

Post by Pavan_Yelugula »

Hi Ray and Wolfgang,
I guess I did not make myself clear about the problem, so I will try to explain it a different way.

I have 20 sequential files with different schemas.
I created 20 tables in my database, as my model is built around a database approach.
Each country where we deploy this will give us its sequential files, and we have to load that data into the tables alongside the existing information from the countries already migrated, and then use the whole data set.

To achieve this, the different ways I could think of are:

1st way:
20 different jobs, each of which looks like this:
seqfile---->ODBC Table

2nd way:
1 single job which looks like this:
seqfile1-------->ODBC Table
seqfile2--------->ODBC Table
.
.
.
seqfile20-------->ODBC Table

The design also looks a bit odd in this approach

3rd way:
One routine which will open sequential file 1, connect to the respective ODBC table and load the data; I will write the routine so it does the same thing for the remaining 19 sequential files.

The first two ways use traditional DataStage stages but make the jobs look too clumsy, so I thought the third one was better.

I have actually finished writing the routine for the third way, which I will post for your feedback in a day or two.

If I have made myself clear, please suggest any other way of doing it.

Ray, can you also please tell me where I ought to buy the licence and roughly how much it will cost us, so that we can at least make a proposal to the client?
I am sure the client is going to make us stick to the 1st or the 2nd approach :D. I just wanted to know how much the licence costs and where to get it from.

Waiting for your invaluable suggestions.
Thanks and Regards
Pavan
vmcburney
Participant
Posts: 3593
Joined: Thu Jan 23, 2003 5:25 pm
Location: Australia, Melbourne
Contact:

Post by vmcburney »

You are effectively bypassing DataStage and forcing the data straight into the tables without much care for transformation or metadata management. The correct design in DataStage is to have a job per table definition, so 20 jobs instead of 1 routine. Since you are not doing any complex transformations, building these jobs will be very fast, faster than trying to build your own ETL engine in a BASIC routine.

If you try to get the single-routine-for-all-table-definitions approach working, you are going to have to maintain the file descriptions yourself, and you will hit data type mismatches and unexpected data content without the benefits DataStage gives you of warnings, rejects and array sizing. You also lose the benefit of importing sequential file and table definitions and using them in jobs.
Last edited by vmcburney on Thu Jan 06, 2005 10:23 pm, edited 1 time in total.
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL
Contact:

Post by kcbland »

You need to state your database. If you're using Oracle, then you don't even need to use DataStage. You need to learn sqlldr. It has every piece of functionality you have described, plus it uses native connectivity and has the highest performing loading rate for Oracle.

You can use sqlldr to create tables, and you can dynamically write the mapping document (a ".ctl" file) that describes which source columns map to which target columns. You can derive columns, do math, etc. You have restart capability, plus techniques for inserting and updating (merge).

You need to forget that you have DataStage. You are trying to use a Space Shuttle as a submarine. You need the right tool for the job, and DATASTAGE IS NOT IT. It is an ETL tool, with larger considerations (metadata, graphical paradigm, etc). Your considerations are too far from what an ETL tool does. This is not criticism of you, it is a peer-to-peer recommendation to use a more appropriate tool.
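
For illustration only, such a control file and sqlldr command line might look like the following; the table, columns, file names and connect string are all made up.

Code:

-- customers.ctl - hypothetical mapping of one flat file to one table
LOAD DATA
INFILE 'customers_in.dat'
APPEND
INTO TABLE customers
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  customer_id,
  customer_name,
  created_date   DATE "YYYY-MM-DD"
)

and a typical invocation:

Code:

sqlldr userid=myuser/mypassword@mydb control=customers.ctl log=customers.log bad=customers.bad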
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

The vendor is MainSoft (formerly known as Data Direct). I do not have current pricing information, but I can tell you that it's priced per CPU in your machine, and you should expect tens of thousands of US dollars.

Search the Forum for examples of code using BCI functions - there are quite a few that have been posted over the years.

All that said, I'd still advocate calling a small, separate DataStage job from your routine, and having that job move the text to the table. (Plus there are no licensing issues!)
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
vmcburney
Participant
Posts: 3593
Joined: Thu Jan 23, 2003 5:25 pm
Location: Australia, Melbourne
Contact:

Post by vmcburney »

Exactly. Your 3rd way would be easier to do in Visual Basic, as it is a more user-friendly programming tool. I would estimate that the 1st way would take about 2 days; the 2nd way 3-5 days, because you could end up debugging performance and memory problems; and the 3rd way 10 days, because you are manually entering your metadata and coding your own error handling and data type conversions.
alvarez-m
Participant
Posts: 13
Joined: Tue Nov 16, 2004 3:12 pm

Post by alvarez-m »

Hi Ray, Pavan

I am looking for information on BCI. Would you be so kind as to send me the BCI.pdf?

Max.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Follow the link in the third post on this thread. 8)
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.