linear/cyclic depend error on vertical pivot

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

tjmalone
Participant
Posts: 8
Joined: Thu Jul 14, 2005 7:06 pm

linear/cyclic depend error on vertical pivot

Post by tjmalone »

Hi, just a question: I am getting a "linear or cyclic dependency" error message with a job. The job is trying to perform a vertical pivot, with the four hashed files and the main stream all coming from the same source.

Any ideas on how to solve or avoid this problem would be greatly appreciated.

Regards

Travis
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

The error message basically means you have created a 'closed loop' in your job stream - you need to resolve that before the job will compile, let alone run.

Make sure your job has a distinct beginning and end, and that nothing points back upstream to create a loop.
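
To picture what the compiler is objecting to: it has to find an order in which the stages can run, and a closed loop means no such order exists. Here is a rough Python sketch with made-up stage names (nothing DataStage actually exposes), just to show the idea:

[code]
from graphlib import TopologicalSorter, CycleError

# Hypothetical link graphs: each stage maps to the stages it reads from.
ok_job = {
    "Transformer": ["SeqFile"],
    "HashedFile": ["Transformer"],
    "Target": ["HashedFile"],
}
looped_job = dict(ok_job, SeqFile=["Target"])  # the end points back at the start

for name, links in (("ok_job", ok_job), ("looped_job", looped_job)):
    try:
        print(name, "runs in order:", list(TopologicalSorter(links).static_order()))
    except CycleError as err:
        print(name, "has a closed loop:", err.args[1])
[/code]

The first graph orders cleanly; the second has no valid order, which is exactly what the compiler is complaining about.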
-craig

"You can never have too many knives" -- Logan Nine Fingers
tjmalone
Participant
Posts: 8
Joined: Thu Jul 14, 2005 7:06 pm

Post by tjmalone »

The problem appears to occur at a hashed file which is both written to and then read. Is there any way to split this up, while ensuring it doesn't read from the file until it has finished being written to?

Thanks

Travis
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Sure. Actually, that kind of thing (dependencies between different 'sections' of your job, that is) should be handled automatically. :?

Best to post an actual image of your job design, Travis, if you don't mind. Check the next-to-last post on the first page of this thread for an easy method to accomplish this.

Take two shakes after that, I would think. :wink:
-craig

"You can never have too many knives" -- Logan Nine Fingers
tjmalone
Participant
Posts: 8
Joined: Thu Jul 14, 2005 7:06 pm

Post by tjmalone »

Hi, it appears I have solved my problem. I just saved the main stream and the other three streams (all of which came from the same source) to hashed files, and then brought them into one transformer for the mapping/vertical pivot.

So this will suffice for the moment, but is there a way to avoid mapping the main stream into a hashed file to map across, or is this even a problem? (I'm kinda new to DataStage.)

The job is:

a Sequential File stage running into one transformer, from which rows are written into four hashed files based on a criterion; a vertical pivot is then performed to merge the four rows into one, after which that row is written to its destination.
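
In case it helps anyone reading later, this is roughly what the pivot works out to, written out in plain Python (the key and column names are just placeholders, not my real ones):

[code]
from collections import defaultdict

# One input stream, four rows per key that need to become one wide row.
source_rows = [
    {"acct": "A1", "type": "Q1", "amount": 100},
    {"acct": "A1", "type": "Q2", "amount": 110},
    {"acct": "A1", "type": "Q3", "amount": 90},
    {"acct": "A1", "type": "Q4", "amount": 120},
]

# Step 1: split the stream into one keyed lookup ("hashed file") per criterion.
hashed = defaultdict(dict)                      # type -> {acct: amount}
for row in source_rows:
    hashed[row["type"]][row["acct"]] = row["amount"]

# Step 2: the pivot - one output row per key, one column per lookup.
pivoted = [
    {"acct": acct,
     "q1": hashed["Q1"].get(acct),
     "q2": hashed["Q2"].get(acct),
     "q3": hashed["Q3"].get(acct),
     "q4": hashed["Q4"].get(acct)}
    for acct in sorted({r["acct"] for r in source_rows})
]

print(pivoted)   # [{'acct': 'A1', 'q1': 100, 'q2': 110, 'q3': 90, 'q4': 120}]
[/code]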

Regards

Travis
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Use a separate Hashed File stage to update the hashed file from the stream. In the first stage, where you are reading from the hashed file, make sure that you set "lock for update", and do not use caching in either Hashed File stage.
Both Hashed File stages refer to the same hashed file.
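
A rough Python analogue of what that arrangement achieves per row - read the current record under a lock, merge the new column, write it straight back with no cache in between (the names here are illustrative only, not DataStage internals):

[code]
import threading

hashed_file = {}             # stands in for the shared hashed file
lock = threading.Lock()      # stands in for "lock for update"

def process_row(key, column, value):
    """One row through the transformer: read, merge, write back."""
    with lock:                               # read with lock for update
        record = hashed_file.get(key, {})    # reference lookup, no read cache
        record[column] = value               # transformer adds the new column
        hashed_file[key] = record            # second stage writes it back, no write cache

for col, val in [("q1", 100), ("q2", 110), ("q3", 90), ("q4", 120)]:
    process_row("A1", col, val)

print(hashed_file)   # {'A1': {'q1': 100, 'q2': 110, 'q3': 90, 'q4': 120}}
[/code]

With caching enabled the read would not see the previous row's write, which is why it must be off in both stages.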
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.