Lookup issue

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

svkworld
Premium Member
Posts: 39
Joined: Mon Nov 21, 2005 10:08 am

Lookup issue

Post by svkworld »

I have a source table like the following:

id, date_of_service

The lookup table has the following columns:

id, eff_date, term_date

I would like to retrieve the matching row from the lookup table where id = id and date_of_service BETWEEN eff_date AND term_date. I am unable to code the second condition in DataStage. Any help is greatly appreciated! (including pointers to the right threads :) )
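For anyone puzzling over the logic itself, the desired "range lookup" can be sketched in plain Python (the column names match the post; the sample data and the helper function are purely illustrative):

```python
from datetime import date

# Lookup rows: (id, eff_date, term_date) -- one id may have several ranges
lookup = [
    (1, date(2006, 1, 1), date(2006, 6, 30)),
    (1, date(2006, 7, 1), date(2006, 12, 31)),
    (2, date(2006, 1, 1), date(2006, 12, 31)),
]

def range_lookup(src_id, date_of_service):
    """Return the first lookup row whose id matches and whose range
    satisfies eff_date <= date_of_service <= term_date (the BETWEEN)."""
    for lk_id, eff_date, term_date in lookup:
        if lk_id == src_id and eff_date <= date_of_service <= term_date:
            return (lk_id, eff_date, term_date)
    return None  # no matching range

# Source rows: (id, date_of_service)
for src_id, dos in [(1, date(2006, 3, 15)), (2, date(2006, 7, 1))]:
    print(src_id, dos, range_lookup(src_id, dos))
```

A plain hashed-file lookup can only do the equality part (id = id); the range condition is what pushes people toward the UV stage or post-lookup filtering, as discussed below.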
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Search for 'range lookup' amongst the forums and you shall find what you seek, Pilgrim. :wink:
-craig

"You can never have too many knives" -- Logan Nine Fingers
svkworld
Premium Member
Posts: 39
Joined: Mon Nov 21, 2005 10:08 am

Post by svkworld »

Thanks. I searched the forum and followed the steps to use the UV stage for the lookup. But when I specify the hashed file name as the table name and do a View Data, it says "table does not exist."

Any ideas?
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Are you certain that the hashed file exists (define it in a job, use "view data" in a Hashed file stage)? If so, how did you declare it as a UV stage?
svkworld
Premium Member
Posts: 39
Joined: Mon Nov 21, 2005 10:08 am

Post by svkworld »

Yes, I did create the hashed file in a separate job and I am able to view data from that job. It throws an error only when I try to view it from the UV stage.
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Well then, how did you define it in the UV stage? With which connection?
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

If you created the hashed file 'pathed' in a directory rather than the account, you need to create a VOC pointer to it before you can use it with the UV stage.
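For reference, a VOC pointer to a pathed hashed file is usually created with the UniVerse SETFILE command at the UniVerse/TCL prompt of the project (the path and names below are illustrative, not from the thread):

```
SETFILE /data/myproject/hashed/MY_LOOKUP MY_LOOKUP OVERWRITING
```

This writes a VOC entry named MY_LOOKUP pointing at the file on disk, after which the UV stage can see it as a table. (Not needed here if the file really lives in the account, as the next post states.)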
-craig

"You can never have too many knives" -- Logan Nine Fingers
svkworld
Premium Member
Posts: 39
Joined: Mon Nov 21, 2005 10:08 am

Post by svkworld »

I created the hashed file in the account, not in a pathed directory.

And in the UV stage I specified DataSource as localuv and imported the hashed file table definition.

What am I missing?
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Try changing the table name to "VOC" and doing a View Data on it. Do you get the same error message, or one about bad column names?
irajasekharhexa
Premium Member
Posts: 82
Joined: Fri Jun 03, 2005 5:23 am
Location: Bangalore
Contact:

Post by irajasekharhexa »

ArndW wrote:Try changing the table to "VOC" and do a view data on it? Do you get the same error message or one stating bad column names?
Why can't we use the Transformer stage and stage variables with logical operators to check the date range, e.g. Actualdate >= rangedate1 and Actualdate <= rangedate2, and then pass the row to the next stage?
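As a rough sketch, that alternative would look something like the following stage-variable expression in a Transformer (link, column, and variable names here are invented for illustration; it assumes the lookup returns candidate rows that still need the range filter):

```
svInRange:
    If in.date_of_service >= lkp.eff_date And in.date_of_service <= lkp.term_date
        Then @TRUE Else @FALSE
```

The output link constraint would then test svInRange. Note this only works cleanly when the lookup can return the candidate ranges to filter; a plain hashed-file lookup returns at most one row per key.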
Rajasekhar
svkworld
Premium Member
Posts: 39
Joined: Mon Nov 21, 2005 10:08 am

Post by svkworld »

I entered VOC in the table name and this is the error I got:

UniVerse_27.DSLink19: DSD.BCIOpenR call to SQLExecDirect failed.

SQLSTATE=S1000, DBMS.CODE=950088
[DataStage][SQL Client][UNIVERSE]DataStage/SQL: syntax error. Unexpected symbol. Token was "ID". Scanned command was FROM VOC SELECT ID
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

That is a good message: it means the VOC is accessible. It also means that you have not declared your hashed file as a local one.
svkworld
Premium Member
Posts: 39
Joined: Mon Nov 21, 2005 10:08 am

Post by svkworld »

This is what I have done in the job that loads to the Hashed file:

ODBC --> Xformer--> Hashed file

In the General tab of the Hashed File stage I selected Use Account Name and picked the one I want from the drop-down list (this is where I created the table definition in the Manager as well).

In the Inputs tab I selected the file name from the drop-down list, which I had created in the Manager under Table Definitions.

Did I miss anything or go wrong somewhere?
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

svkworld wrote:In the General tab of the Hashed File stage I selected Use Account Name and picked the one I want from the drop-down list (this is where I created the table definition in the Manager as well).
What actual value did you pick? This locks your hashed file to always live in that account and not move up along with the jobs to other Projects - test / qa / production / etc.

The vast majority of the time the account should be left blank to mean the 'current' account. Try changing that and rerunning your hashed file creation job.
-craig

"You can never have too many knives" -- Logan Nine Fingers