Joining / doing lookup without a key

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

vnspn
Participant
Posts: 165
Joined: Mon Feb 12, 2007 11:42 am

Joining / doing lookup without a key

Post by vnspn »

Hi,

We have a requirement where we need to bring a value into the primary flow of data from a reference table without matching on a key.

The source is a sequential file. We just need to add one additional column from a reference table into this stream. There is no column on which we could do a match to fetch that reference column. The reference table has only one row, and that single value has to be brought into all the rows of the actual stream. We tried the Lookup stage, but it didn't allow us to bring a field from a reference link without matching on a key.

Could anyone please suggest a method to accomplish this?

Thank you.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Hard-code a static key value into both the reference lookup and the input stream. 'Join' on that.
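To see the idea in isolation, here is a rough sketch of the same trick outside DataStage using plain shell tools; the file names are made up:

# Give both inputs the same constant key, then join on it, so the single
# reference value lands on every stream row.
sed 's/^/1,/' stream.csv    > stream_keyed.csv     # prepend dummy key "1"
sed 's/^/1,/' reference.csv > ref_keyed.csv        # reference holds one row
join -t, -j 1 stream_keyed.csv ref_keyed.csv > joined.csv

Inside the job, the same constant can be added on both links with a Column Generator or a Transformer derivation, and the Lookup or Join stage keyed on that column.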
-craig

"You can never have too many knives" -- Logan Nine Fingers
vnspn
Participant
Posts: 165
Joined: Mon Feb 12, 2007 11:42 am

Post by vnspn »

Thanks for this solution, Craig.

We were wondering if there is a way to do it without hard-coding any static key value. Could there be a solution that uses the Lookup stage without a key? :roll:
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Not that I am aware of; joins need 'keys' / matching columns. Perhaps someone else has some other thoughts on this...?
-craig

"You can never have too many knives" -- Logan Nine Fingers
MOHAMMAD.ISSAQ
Participant
Posts: 78
Joined: Fri Mar 02, 2007 4:54 am
Location: CHENNAI

Post by MOHAMMAD.ISSAQ »

Hi,
Instead of using the table as a reference, why can't you just add an extra column and hard-code the reference table's value in the actual stream, since all the rows of the actual stream need this column?

Please correct me if I'm wrong...
vnspn
Participant
Posts: 165
Joined: Mon Feb 12, 2007 11:42 am

Post by vnspn »

No, that cannot be done. The column we fetch from the reference table is an ID column, so its value would be different when the job is migrated to different environments.

Also, we have to do the same kind of ID-column fetching from different reference tables in different jobs. If we hard-coded the value in all those places, it would mean too much hard-coding across the project, so we wanted to avoid that.
bcarlson
Premium Member
Posts: 772
Joined: Fri Oct 01, 2004 3:06 pm
Location: Minnesota

Post by bcarlson »

Would the Column Generator work? How about a parameterized Transformer or BuildOp? The parameter would hold the value that is to be placed in the new column.

brad.
kumar_s
Charter Member
Posts: 5245
Joined: Thu Jun 16, 2005 11:00 pm

Post by kumar_s »

That is, read the value from the table, pass it as a parameter to the job, and substitute it in a Transformer for the specific fields.
This way you also avoid a lookup for each record (in case you were doing a sparse lookup).
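For illustration, one way to do this from a script (the connection string, table, project and job names below are placeholders) is to fetch the value once with sqlplus and start the parallel job with dsjob, passing the value as a job parameter:

#!/bin/sh
# Fetch the single reference value once (placeholder connection and table).
REF_ID=$(sqlplus -s user/pass@db <<'EOF'
set heading off feedback off pagesize 0
select ref_id from ref_table;
exit
EOF
)

# Hand it to the parallel job as a job parameter; a Transformer or Column
# Generator derivation then writes the REF_ID parameter into the new column.
dsjob -run -param REF_ID="$REF_ID" MyProject MyParallelJob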
Impossible doesn't mean 'it is not possible' actually means... 'NOBODY HAS DONE IT SO FAR'
vnspn
Participant
Posts: 165
Joined: Mon Feb 12, 2007 11:42 am

Post by vnspn »

Hi kumar_s,

This looks like a nice idea. But how can the output of a SQL statement be passed to a parameter? Could you please give a brief note on how to do that?

Thank you.
kumar_s
Charter Member
Posts: 5245
Joined: Thu Jun 16, 2005 11:00 pm

Post by kumar_s »

Since the parameter data is very small, you can use a Server job to read the value from the table; in its Transformer, UtilityRunJob can be used to call the target job and pass the extracted value to it as a parameter.
Alternatively, you can fetch the value from the table, store it in a file, and then read the file via an Execute Command activity or any script to pass it along.
But the first option may be simpler.
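As a rough sketch of the second option (connection, table and path names are again placeholders): land the value in a flat file, then let the sequence read it back and feed it into the job activity's parameter.

# Step 1: write the single value to a flat file (this could equally be done
# by a small Server or Parallel job instead of sqlplus).
sqlplus -s user/pass@db <<'EOF' > /tmp/ref_id.txt
set heading off feedback off pagesize 0
select ref_id from ref_table;
exit
EOF

# Step 2: command for an Execute Command activity in the sequence; its output
# can then be mapped to the downstream job activity's REF_ID parameter
# (for example via the activity's $CommandOutput).
cat /tmp/ref_id.txt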
Impossible doesn't mean 'it is not possible' actually means... 'NOBODY HAS DONE IT SO FAR'
vnspn
Participant
Posts: 165
Joined: Mon Feb 12, 2007 11:42 am

Post by vnspn »

kumar_s wrote:Since the parameter data is very small, you can use a Server job to read the value from the table; in its Transformer, UtilityRunJob can be used to call the target job and pass the extracted value to it as ...
I would like to clarify what you are saying. You mean we should have a separate Server job that runs the SQL to fetch the required value, and then call that job from the second job to get the output of that SQL.

In that case, should I use an Oracle stage in the first Server job to execute the SQL, or some other means?

Also, in the second job I see that "UtilityRunJob" cannot be used from a parallel job's Transformer.