Error with Dataset stage while using instances as target

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

kishore2456
Participant
Posts: 47
Joined: Mon May 07, 2007 10:35 pm

Error with Dataset stage while using instances as target

Post by kishore2456 »

Hi all, I have developed a job in which I do a lookup against a Data Set, and I'm using the same Data Set with two instances in two streams. When I try to run the job, it aborts with the following error.
Here dssource/1 is my Data Set:


Operator initialization: The data "/dssource/1" is used as output multiple times
Operator initialization: The data "/dssource/1" may not have more than one file/ds override

Can you please tell me why this occurred and how to resolve it?
FD
Maveric
Participant
Posts: 388
Joined: Tue Mar 13, 2007 1:28 am

Post by Maveric »

I don't think you can have two instances of the same Data Set. Use a Copy stage instead, with two output links.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

As well, you should name your Data Set control files with a ".ds" suffix.

Best practice is to create a directory in which to store Data Set and File Set control files.
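As a sketch of that convention (directory and file names here are illustrative, not taken from the post):

```shell
# One directory for all Data Set / File Set control files,
# each control file named with a .ds suffix.
# Paths are assumptions for illustration only.
mkdir -p ./datasets
touch ./datasets/dssource.ds   # Data Set control file
ls ./datasets
```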
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
kishore2456
Participant
Posts: 47
Joined: Mon May 07, 2007 10:35 pm

Post by kishore2456 »

In my job I'm implementing SCD2, in which I need to divide the flow into two pipelines: one for inserting and the other for updating. So I definitely need two instances? Is there any alternate approach?

I didn't get the meaning of "Use a copy stage instead with two output links".
Even though I placed a Copy stage after the lookup for both, I'm still getting the same error. Can you explain more?
FD
Maveric
Participant
Posts: 388
Joined: Tue Mar 13, 2007 1:28 am

Post by Maveric »

I got your requirement wrong; I thought you were using the same Data Set twice as a source. Since you say it is used as a target, the same rule applies: multiple instances of the same Data Set are not allowed in the same job. If the metadata for both links is the same, why not use a Filter stage, with a flag field to differentiate between inserts and updates?
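What a Filter stage with two output links does can be sketched in plain Python (an illustrative analogue only, not DataStage code; the field name "flag" and the key name "cust_id" are assumptions):

```python
# Route each record to the "inserts" or "updates" stream based on a
# flag field - the same split a Filter stage with two output links
# performs, but both streams can then land in a single Data Set.

def split_by_flag(records):
    """Send flag 'I' records to inserts, everything else to updates."""
    inserts, updates = [], []
    for rec in records:
        if rec["flag"] == "I":
            inserts.append(rec)
        else:
            updates.append(rec)
    return inserts, updates

rows = [
    {"cust_id": 1, "flag": "I"},
    {"cust_id": 2, "flag": "U"},
    {"cust_id": 3, "flag": "I"},
]
ins, upd = split_by_flag(rows)
print(len(ins), len(upd))  # 2 1
```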
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

There's no such concept as insert/update for Data Sets. Add a flag column that contains "I" or "U" in the Data Set, and use that to indicate whether the record is to be inserted or updated. Use that in the job that subsequently reads from the Data Set.
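The flag-assignment step can be sketched like this (an illustrative Python analogue of the lookup-and-tag logic; field names "cust_id" and "name" are assumptions, not from the original job):

```python
# Tag each incoming record with "I" (key not in the dimension) or
# "U" (key exists but attributes changed), so one Data Set can carry
# both kinds of change and a downstream job can act on the flag.

def tag_records(incoming, dimension):
    """dimension maps business key -> current attribute value."""
    tagged = []
    for rec in incoming:
        key = rec["cust_id"]
        if key not in dimension:
            rec["flag"] = "I"          # new key: insert
        elif dimension[key] != rec["name"]:
            rec["flag"] = "U"          # changed: SCD2 close-and-insert
        else:
            continue                   # unchanged: no action needed
        tagged.append(rec)
    return tagged

dim = {1: "Alice", 2: "Bob"}
out = tag_records(
    [{"cust_id": 1, "name": "Alice"},   # unchanged -> dropped
     {"cust_id": 2, "name": "Robert"},  # changed   -> "U"
     {"cust_id": 3, "name": "Carol"}],  # new       -> "I"
    dim,
)
print([(r["cust_id"], r["flag"]) for r in out])  # [(2, 'U'), (3, 'I')]
```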
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.