Shared container

Post questions here relating to DataStage Server Edition, covering such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

reddy
Premium Member
Posts: 168
Joined: Tue Dec 07, 2004 12:54 pm

Shared container

Post by reddy »

Hello Sir,

I created a job that builds a hash file. After that I want to make it a shared container, because I am going to use it in 20 more jobs. So I went for a server shared container: I copied this job, highlighted everything other than the output link, and clicked Shared Container. But when I go to use it in another job, I get an error message saying the output link is referenced in the shared container as a stream link, while in this job it is referenced as a reference link.

Please help me with how to prepare a shared container, step by step.

Thanks
Narsa
ds_developer
Premium Member
Posts: 224
Joined: Tue Sep 24, 2002 7:32 am
Location: Denver, CO USA

Post by ds_developer »

If all you want is to use this hash file as a lookup in other jobs, then you don't need a shared container. Just add a hash file stage to those jobs and add the name and columns (Table Definition). The error is referring to the difference between a solid link (a stream) and a dashed link (a reference).

The Transformer stage only supports one stream input link, but multiple reference links.

It looks like you built the hash file using a stream link and are now trying to use it as a reference.
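
As a rough sketch (stage and link names here are only placeholders, not taken from your job), a lookup job would be laid out like this, with the hash file feeding the Transformer over a dashed reference link:

                  Hash File
                      :
                      :  (reference link - dashed)
                      v
Source ---------> Transformer ---------> Target
      (stream)                (stream)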

Hope this helps,
John
reddy
Premium Member
Posts: 168
Joined: Tue Dec 07, 2004 12:54 pm

Post by reddy »

Thanks for the quick response.

But I have to create a shared container as per my specifications.

I created a job whose final output is a hash file, so for reusability I need to turn that job into a shared container. To do that, I copied the job and pasted it into a server shared container canvas.

When I try to use it in the main job, I get a compilation error like this:

the hash file is created with a stream link in the shared container, but it is being used as a reference link.

Please help me out.

Thanks
Narsa
ds_developer
Premium Member
Posts: 224
Joined: Tue Sep 24, 2002 7:32 am
Location: Denver, CO USA

Post by ds_developer »

1. Create a new server job in Designer.
2. Add a hash file stage to it.
3. Draw a link from the hash file stage to anywhere.
4. Right-click on the link and choose "Convert to reference".
5. Load the hash file stage with the same information you used when writing the hash file.
6. Highlight the hash file stage and use Edit | Construct Container | Shared.
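
If it helps, here is a rough picture of what step 6 gives you and how it then plugs into a main job (stage and link names are only placeholders):

Inside the shared container:  Hash File - - - - - - ->  (one reference output link)

In the main job:              Source ---------> Transformer ---------> Target
                                                     ^
                                                     :
                                     shared container (reference link)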

Should work,
John
reddy
Premium Member
Posts: 168
Joined: Tue Dec 07, 2004 12:54 pm

Post by reddy »

Hello sir,

My situation is like this, sir.

I created a simple job that just loads the product table into a hash file for the weekly load jobs.

Job: ODBC stage -----------> Transformer stage -----------> Hash file

I highlighted the ODBC stage and the Transformer stage and created a shared container.

After that I used this shared container in my main job as a lookup.

When compiling I get an error like this:

The container stage link is a reference link but its mapped link in the shared container is a stream link.

Please help me out with how to overcome this problem.

Thanks in advance.

Narasa
ds_developer
Premium Member
Posts: 224
Joined: Tue Sep 24, 2002 7:32 am
Location: Denver, CO USA

Post by ds_developer »

Do you want to reload the product data every time you use the shared container? I doubt it - that is why you wrote it to a hash file. As I see it, you want to build the hash file once (a day/week/...) and then use it several times via a shared container. If this is true, then do the steps I outlined in my earlier post.
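
In other words, the intended setup is roughly this (job and stage names are only placeholders):

Load job (run once a day/week):  ODBC ---------> Transformer ---------> Hash File

Lookup jobs (many):              Source ---------> Transformer ---------> Target
                                                        ^
                                                        :
                                  shared container with the Hash File (reference link)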

John
reddy
Premium Member
Posts: 168
Joined: Tue Dec 07, 2004 12:54 pm

Post by reddy »

Hi Guys,

Please help me Datastage gurus:

I created a job like this:

ODBC -----------> Transformer ------------> Hash file (Output stage)

I made it into a shared container.

I used this shared container in my main job as a reference (reference link), but I created it with a stream link, so I am getting an error.

Please help me understand what I did wrong.


Thanks
Narsa
pnchowdary
Participant
Posts: 232
Joined: Sat May 07, 2005 2:49 pm
Location: USA

Post by pnchowdary »

reddy wrote: Hi Guys,

I created a job like this:

ODBC -----------> Transformer ------------> Hash file (Output stage)


Taking your design:

ODBC -------------> Transformer -------------> Hash File ------------>

L1 is the link between ODBC and the Transformer.
L2 is the link between the Transformer and the Hash File.
L3 is the link from the Hash File to the output.

I believe that you have all the links L1, L2, and L3 as stream links (solid arrows).
To get rid of your error, right-click on link L3 and, from the popup menu, choose Convert to reference.
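
After that conversion, the end of the design should look roughly like this (the dashes on L3 stand for the dashed reference link; the rest is unchanged):

ODBC -------------> Transformer -------------> Hash File - - - L3 - - ->

That reference output link is what the shared container exposes, so the main job can attach it to a Transformer as a lookup.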

This should take care of your problem. Let me know whether it worked for ya.