Parameterizing output of Job
Hi,
I have a scenario like this:
1. Query an Oracle table to get a date field and keep it as a parameter.
2. Use that parameter in a WHERE clause when I query Red Brick tables to get the required columns.
As the two tables reside in different schemas, one in Oracle and the other in Red Brick, I cannot do a join. Options I thought of:
1. Pass the Oracle date field to a sequential file, then open and read it into a parameter before the second job (which queries Red Brick) and use that parameter in the user-defined SQL.
2. Unload the table from Oracle into Red Brick as a work table, do whatever insert/update/delete I have to do there, and at the end load it back to Oracle. But this is like maintaining a mirror table, so I dropped the idea.
I went ahead with approach #1 but felt it to be a bit lengthy and roundabout. Is there any way I can pass a runtime value to a parameter in DataStage?
e.g. define #Datetime# in DS Administrator with some default value, and when the first job (querying Oracle) runs, its output is assigned to the #Datetime# parameter so I can use it in later jobs.
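For what it's worth, approach #1 can also be driven entirely from job control code rather than by hand. A minimal DataStage BASIC sketch, assuming the first job has already written the date to /tmp/run_date.txt and the second job is called RedbrickJob with a Datetime parameter (all three names are assumptions for illustration):

```
* Job control sketch: read the date written by the first job and
* pass it as a parameter to the job that queries Red Brick.
OpenSeq "/tmp/run_date.txt" To DateFile Then
   ReadSeq RunDate From DateFile Then Null
   CloseSeq DateFile
End

hJob2 = DSAttachJob("RedbrickJob", DSJ.ERRFATAL)
ErrCode = DSSetParam(hJob2, "Datetime", RunDate)
ErrCode = DSRunJob(hJob2, DSJ.RUNNORMAL)
ErrCode = DSWaitForJob(hJob2)
```

The same hand-off could equally be scripted outside DataStage with the dsjob command line and its -param option.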
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
Red Brick specialist also
Can you state your actual requirement in terms of what you want to get out of the Red Brick database? We're aware that the date comes from an Oracle database, and can probably manage that in a job parameter, but how do you need to use this date in a Red Brick query?
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
The date field that I want to pass as a runtime parameter will be used in the user-defined query in the ODBC stage where I query the Red Brick database, something like this:
Select
c1,
c2,
..
from Table1
where Load_datetime > #param#
The above #param# is the value I get from the Oracle database and want to use while querying tables in the Red Brick database.
Just make two jobs:
1) Extract from the Oracle database and load your date value (which you want to pass to the Red Brick query) into a hashed file.
2) Define a parameter, write the query for Red Brick in the ODBC stage, and pass that parameter in the query.
Then create a sequence job:
1st job ---> routine activity (DSUtilityHashLookUp) ---> 2nd job
and in the 2nd job activity, pass the return value from the routine activity (the hashed-file lookup) as the parameter.
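A minimal BASIC routine along those lines could simply open the hashed file and return the value stored under the given key. This is an illustrative sketch, not the SDK routine itself; the routine name, argument names, and single-field record layout are all assumptions:

```
* Routine sketch: return the first field of the record stored under
* KeyValue in the hashed file HashFileName; "" if file or key is missing.
FUNCTION GetHashValue(HashFileName, KeyValue)
   Ans = ""
   Open HashFileName To HashFile Then
      Read Rec From HashFile, KeyValue Then
         Ans = Rec<1>   ;* first field of the record
      End
      Close HashFile
   End
RETURN(Ans)
```

In the sequence, the routine activity's return value can then be typed into the second job activity's parameter field.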
The first job selects the date from Oracle and loads it into that job's user status area.
The job sequence accesses the first job's user status area using the activity variable $UserStatus, in the definition of the second job's job parameter. Type FirstJob.$UserStatus into that field, where FirstJob is the name of the job activity that executes the first job.
The approach suggested by swades will work, but requires that a hashed file be created (or an existing hashed file used). Using the user status area does not require an external hashed file.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
jreddy wrote: Ray, how would we set the value returned from the Oracle job/table to the user status variable?

In the job that selects the value from Oracle, use an interlude to the DSSetUserStatus() function as the derivation in a Transformer stage. Search the forum for sample code.
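The sample code usually takes the form of a tiny transform function that stashes its argument in the user status area and passes it through unchanged, so it can sit directly in a column derivation. A sketch, where the function name and the input column are illustrative:

```
* Transform function sketch: use it in a Transformer derivation, e.g.
*    SetUserStatus(InLink.RUN_DATE)
* It records the value in the job's user status area via DSSetUserStatus
* and returns it unchanged so the column still flows downstream.
FUNCTION SetUserStatus(Arg1)
   Call DSSetUserStatus(Arg1)
   Ans = Arg1
RETURN(Ans)
```

The sequence can then pick the value up as FirstJob.$UserStatus, as described above.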
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.