Link Collector

Post questions here relating to DataStage Server Edition, covering areas such as Server job design, DS BASIC, routines, job sequences, etc.

Moderators: chulett, rschirm, roy

DS_MJ
Participant
Posts: 157
Joined: Wed Feb 02, 2005 10:00 am

Link Collector

Post by DS_MJ »

Hello:

I am building a Server job in DataStage 7.5 on Windows. The job consists of shared containers, hashed file lookups, a Link Collector/Link Partitioner, and sequential files as source and target.

When I compile this job I get the error "Link Collector stage does not support in-process active-to-active inputs or outputs", followed by the job name.

I have converted the Transformer into a shared container and use that shared container 4 times in this job. I need to use the shared container rather than a plain Transformer.

What else can I use? I need both the shared container and the collector for the job to run efficiently.

I would appreciate any advice at your earliest convenience.

Thanks in advance.

DS_MJ
garthmac
Charter Member
Posts: 55
Joined: Tue Oct 21, 2003 9:17 am
Location: UK

Post by garthmac »

Hi,

For Link Collectors, my understanding is that you need sequential files (or hashed files) as your sources and targets. You will get an error if you try to collect your data straight out of a Transformer, which is probably the source of your error.
Sainath.Srinivasan
Participant
Posts: 3337
Joined: Mon Jan 17, 2005 4:49 am
Location: United Kingdom

Post by Sainath.Srinivasan »

The Link Collector must be attached directly to a source file, not to an active stage such as a Transformer. If your source is not passive, land the result in a sequential file and connect that to your Link Collector.
manteena
Premium Member
Posts: 38
Joined: Thu Feb 10, 2005 1:43 pm
Location: USA

Re: Link Collector

Post by manteena »

DS_MJ wrote: Hello:

I am building a Server job in DataStage 7.5 on Windows. The job consists of shared containers, hashed file lookups, a Link Collector/Link Partitioner, and sequential files as source and target.

When I compile this job I get the error "Link Collector stage does not support in-process active-to-active inputs or outputs", followed by the job name.

I have converted the Transformer into a shared container and use that shared container 4 times in this job. I need to use the shared container rather than a plain Transformer.

What else can I use? I need both the shared container and the collector for the job to run efficiently.

I would appreciate any advice at your earliest convenience.

Thanks in advance.

DS_MJ
I prefer Inter Process (IPC) stages before the collection, for improved performance.
DS_MJ
Participant
Posts: 157
Joined: Wed Feb 02, 2005 10:00 am

Link Collector Error while compiling

Post by DS_MJ »

Hello:

Thank you for your replies.

However, my Link Partitioner is directly connected to my source sequential file, and my Link Collector is directly connected to my target sequential files. My hashed file lookups go into the shared containers, the shared containers feed the Link Collector, and the Link Collector feeds my target sequential files. So:

INPUT sequence:

Source sequential file --> Link Partitioner (Round Robin) --> shared containers <-- hashed file lookups going into the shared containers.

OUTPUT sequence:

Hashed file lookups --> shared containers --> Link Collector (Round Robin) --> target sequential file.

This design has worked in my other jobs; however, those had only one hashed file lookup going into the shared container --> Link Collector --> target sequential file.
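For readers unfamiliar with these stages, the Round Robin flow described above can be sketched outside DataStage as plain functions: the partitioner deals rows across N links one at a time, and the collector reads one row from each link in turn to rebuild a single stream. This is only an illustrative sketch; the function names are hypothetical, not DataStage APIs.

```python
from itertools import cycle

def partition_round_robin(rows, n_links):
    """Deal rows across n links, one row per link in turn
    (what a Link Partitioner does in Round Robin mode)."""
    links = [[] for _ in range(n_links)]
    for link, row in zip(cycle(links), rows):
        link.append(row)
    return links

def collect_round_robin(links):
    """Merge the links back into one stream by reading one row
    from each link in turn (a Round Robin Link Collector)."""
    out = []
    iters = [iter(link) for link in links]
    exhausted = [False] * len(iters)
    while not all(exhausted):
        for i, it in enumerate(iters):
            if not exhausted[i]:
                try:
                    out.append(next(it))
                except StopIteration:
                    exhausted[i] = True
    return out

rows = list(range(10))
links = partition_round_robin(rows, 4)
assert collect_round_robin(links) == rows  # partition then collect is lossless
```

Note that Round Robin collection restores the original order only because the partitioner dealt the rows in the same fixed rotation.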

Thanks,
DS_MJ
DS_MJ
Participant
Posts: 157
Joined: Wed Feb 02, 2005 10:00 am

Post by DS_MJ »

Hi Manteena:

Thanks for your response.

However, there are 4 shared containers, so 4 links go into the Link Collector; hence I cannot use an IPC stage between the shared containers and the Link Collector, because an IPC stage accepts only one input.

Thanks
DS_MJ
manteena
Premium Member
Posts: 38
Joined: Thu Feb 10, 2005 1:43 pm
Location: USA

Post by manteena »

DS_MJ wrote: Hi Manteena:

Thanks for your response.

However, there are 4 shared containers, so 4 links go into the Link Collector; hence I cannot use an IPC stage between the shared containers and the Link Collector, because an IPC stage accepts only one input.

Thanks
DS_MJ
Place 4 IPCs on the 4 shared container links, then map the 4 output links to the LC:
SC1 --> IPC1 --> LC1, and likewise for the others.
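The effect of one IPC per link can be sketched as one bounded buffer per shared container output, with the collector draining each buffer in turn; the decoupling lets each producer run ahead of the collector. This is an illustrative sketch only (not DataStage code), using a `None` sentinel as a hypothetical end-of-data marker.

```python
import queue
import threading

def run_link(source_rows, buf):
    # One "shared container" output writing into its own IPC-like buffer.
    for row in source_rows:
        buf.put(row)
    buf.put(None)  # end-of-data marker for this link

def collect(buffers):
    # The collector drains one row per buffer in turn, dropping a buffer
    # once its link signals end-of-data.
    out, open_bufs = [], list(buffers)
    while open_bufs:
        for buf in list(open_bufs):
            row = buf.get()  # blocks until the producer has written a row
            if row is None:
                open_bufs.remove(buf)
            else:
                out.append(row)
    return out

links = [[1, 2], [3, 4], [5, 6], [7, 8]]
buffers = [queue.Queue(maxsize=8) for _ in links]
workers = [threading.Thread(target=run_link, args=(rows, buf))
           for rows, buf in zip(links, buffers)]
for w in workers:
    w.start()
result = collect(buffers)  # rows interleaved one per link per pass
for w in workers:
    w.join()
assert result == [1, 3, 5, 7, 2, 4, 6, 8]
```

Because the collector blocks on each buffer in a fixed order, the interleaving is deterministic even though the producers run concurrently.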
DS_MJ
Participant
Posts: 157
Joined: Wed Feb 02, 2005 10:00 am

Post by DS_MJ »

Thanks Manteena:

The compilation completed successfully when I put SC1 --> IPC1 --> LC1 --> target sequential file,
SC2 --> IPC2 --> LC2 --> target sequential file, and so on, as you suggested!

But I still do not understand :? why I have to break the LC into 4 pieces with IPC stages in the middle. Why wouldn't one LC work?

Thanks,
Mamta
manteena
Premium Member
Posts: 38
Joined: Thu Feb 10, 2005 1:43 pm
Location: USA

Post by manteena »

DS_MJ wrote: Thanks Manteena:

The compilation completed successfully when I put SC1 --> IPC1 --> LC1 --> target sequential file,
SC2 --> IPC2 --> LC2 --> target sequential file, and so on, as you suggested!

But I still do not understand :? why I have to break the LC into 4 pieces with IPC stages in the middle. Why wouldn't one LC work?

Thanks,
Mamta
You can try selecting 'Inter process' or 'In process' in Job Properties --> Performance, so that you can eliminate these intermediate IPC stages.
With either option, DataStage opens a temporary passive buffer between the active stages. I think you have an active-to-active-to-active flow between your source and target.
DS_MJ
Participant
Posts: 157
Joined: Wed Feb 02, 2005 10:00 am

Post by DS_MJ »

Thanks. Yes, I selected inter-process on the Job Properties Performance tab instead of using IPC stages, and it works fine.

But the question remains: why did we have to break the LC into 4 parts? Why didn't one LC work?

Thanks,
Mamta
DS_MJ
Participant
Posts: 157
Joined: Wed Feb 02, 2005 10:00 am

Post by DS_MJ »

Hello:

Never mind, I have answered my own question. After selecting inter-process in the job properties I can use just one Link Collector, and the job now compiles without the above error.

Thanks,
DS_MJ
talk2shaanc
Charter Member
Posts: 199
Joined: Tue Jan 18, 2005 2:50 am
Location: India

Post by talk2shaanc »

A simple solution, without adding any extra stage:
Go to Job Properties --> Performance tab. You will see the 'Use Project Defaults' box ticked; untick it, tick "Enable Row Buffer", and select the 'Inter process' radio button. Set the buffer size to suit your data, or keep the default.

With inter-process buffering DataStage in effect inserts a passive stage: one active stage writes rows into the buffer and the second active stage then reads the rows from the buffer.
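That write-then-read handoff through a buffer is the classic producer/consumer pattern, which a minimal sketch can illustrate (names such as `transformer` and `writer` are hypothetical labels for the two active stages, not DataStage APIs):

```python
import queue
import threading

ROW_BUFFER = queue.Queue(maxsize=128)  # bounded, like a row buffer of fixed size
DONE = object()  # sentinel marking end-of-data

def transformer(rows):
    # First active stage: processes rows and writes them into the buffer.
    for row in rows:
        ROW_BUFFER.put(row * 2)
    ROW_BUFFER.put(DONE)

def writer(out):
    # Second active stage: reads rows from the buffer until end-of-data.
    while True:
        row = ROW_BUFFER.get()
        if row is DONE:
            break
        out.append(row)

out = []
t1 = threading.Thread(target=transformer, args=(range(5),))
t2 = threading.Thread(target=writer, args=(out,))
t1.start(); t2.start()
t1.join(); t2.join()
assert out == [0, 2, 4, 6, 8]
```

The bounded queue also shows why a buffer size matters: a full buffer makes the producer block until the consumer catches up, which is how the two stages stay decoupled without unbounded memory use.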
Shantanu Choudhary
DS_MJ
Participant
Posts: 157
Joined: Wed Feb 02, 2005 10:00 am

Post by DS_MJ »

Thanks to all of you for helping me resolve my issue. I appreciate it!

DS_MJ