How to pass values between jobs?

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

vivek_rs
Participant
Posts: 37
Joined: Thu Nov 25, 2004 8:44 pm
Location: Bangalore, Karnataka, India

How to pass values between jobs?

Post by vivek_rs »

Hi
I need to take a country name in as a parameter, do a lookup on a table, and return the country ID to all subsequently running jobs in a sequence.

There are a couple of values like this that I'll have to transmit.

How do I do this?

TIA
Regards,
Vivek RS
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Your basic choice is where you want to write it to disk: in a text file, a hashed file, or a database table.

Separate jobs run in separate processes. In theory you could have one job write to a named pipe and a second job (or the job sequence) read from that named pipe, but coordinating job start times and timeouts would be horrible to manage. Named pipes on Windows are ... "forget about it!"

You could write your own routines (in C) to manage a small piece of shared memory, and store things in there. You would then need to incorporate those routines via the General Call Interface (GCI) into the DataStage executables. Yes, it's doable, but the benefit probably isn't worth the cost.

Also, if it's in a hashed file or a text file, you can use a Routine Activity to read it, and return the value to be picked up by the Job Sequence.

Use a hashed file, use a constant as the key value, and write the value you want to transmit into a non-key column. The nice thing about this approach is that you can use the same hashed file for as many different pieces of information as you want to transmit; just choose a different key for each.
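Just as an illustrative sketch (the hashed file name ParamValues and the routine name GetStoredValue are invented here, not anything supplied with the product), the reading side could be a routine like this, called from a Routine Activity so the sequence picks the result up as <activity>.$ReturnValue:

Code: Select all

      * GetStoredValue(KeyName) - hypothetical sketch.
      * Reads the record keyed by KeyName from a hashed file called
      * ParamValues and returns its first non-key column in Ans.
      Ans = ''
      Open "ParamValues" To FileVar Then
         Read Rec From FileVar, KeyName Then
            Ans = Rec<1>        ;* the non-key column holding the value
         End Else
            Ans = -1            ;* no record stored under this key
         End
         Close FileVar
      End Else
         Ans = -1               ;* hashed file not found in the project
      End

The writing side is just your lookup job, sending the country ID down a link into a Hashed File stage whose key column is set to the constant.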
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
vivek_rs
Participant
Posts: 37
Joined: Thu Nov 25, 2004 8:44 pm
Location: Bangalore, Karnataka, India

Post by vivek_rs »

Hey ray,
I would like to read from a simple sequential file which has a couple of values and put them as parameters into subsequent jobs in the sequence. Please help...
Regards,
Vivek RS
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

In a Command activity you could use cat or head to read the file.
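For example (the file path below is purely illustrative), an Execute Command activity set up as:

Code: Select all

   Command:    head
   Parameters: -1 /data/params/country.txt

puts the first line of the file into <activity>.$CommandOutput, which you can map onto a job parameter in the next Job Activity (you may need to strip a trailing field mark, for example with Field(<activity>.$CommandOutput, @FM, 1)).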

In a Routine activity you could invoke a routine (which you'd have to write) that uses OpenSeq, ReadSeq and CloseSeq statements to read the file.
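A minimal sketch of such a routine (the routine name and variable names are mine; the file path comes in as the argument):

Code: Select all

      * ReadFirstLine(FilePath) - hypothetical sketch.
      * Returns the first line of a text file in Ans, or -1 on failure.
      * Call it from a Routine Activity and pick the result up as
      * <activity>.$ReturnValue.
      Ans = ''
      OpenSeq FilePath To FileVar Then
         ReadSeq FileLine From FileVar Then
            Ans = FileLine
         End Else
            Ans = -1            ;* file is empty
         End
         CloseSeq FileVar
      End Else
         Ans = -1               ;* file not found or not readable
      End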

You could customise the Job Sequence code by turning it into Job Control code in a server job, and adding statements in there to read the file.
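As a very rough sketch of what that job control code could look like (the file path, job name and parameter names are all invented for this example):

Code: Select all

$IFNDEF JOBCONTROL.H
$INCLUDE DSINCLUDE JOBCONTROL.H
$ENDIF
      * Hypothetical job control sketch: read two values from a text file
      * and pass them as parameters to a downstream server job.
      OpenSeq "/data/params/country.txt" To FileVar Else Call DSLogFatal("Cannot open parameter file", "JobControl")
      ReadSeq CountryName From FileVar Else CountryName = ''
      ReadSeq CountryID From FileVar Else CountryID = ''
      CloseSeq FileVar

      hJob = DSAttachJob("LoadCountryFacts", DSJ.ERRFATAL)
      ErrCode = DSSetParam(hJob, "CountryName", CountryName)
      ErrCode = DSSetParam(hJob, "CountryID", CountryID)
      ErrCode = DSRunJob(hJob, DSJ.RUNNORMAL)
      ErrCode = DSWaitForJob(hJob)
      JobStatus = DSGetJobInfo(hJob, DSJ.JOBSTATUS)
      ErrCode = DSDetachJob(hJob)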

Search the forum for examples, particularly of the third approach.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
rbanavar
Participant
Posts: 2
Joined: Wed Oct 20, 2004 2:57 pm

Post by rbanavar »

Good idea, except I would be wary of using Head if you are working on the Enterprise (Parallel) Edition. I have observed inconsistent results when I used the Head stage to parse the header of a Sequential File in a parallel job. I'd love to hear any comments on this.
Sainath.Srinivasan
Participant
Posts: 3337
Joined: Mon Jan 17, 2005 4:49 am
Location: United Kingdom

Post by Sainath.Srinivasan »

You can call the 'head' command as an external source. This will not affect the behaviour of the command.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Plus you labelled your post as Server and posted it on the Server jobs forum. So you got a server answer. 8)
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
talk2shaanc
Charter Member
Posts: 199
Joined: Tue Jan 18, 2005 2:50 am
Location: India

Post by talk2shaanc »

Hey, I created a routine in which you can initialise a variable/value in COMMON, and then that variable can be used in the same job or across different jobs.

FUNCTION StoreValues(Operation,StoreValue,Identifier)

Code: Select all

      Common /SDKRowCompare/ Initialize, LastValue, SeqFile

      Equate RoutineName To 'StoreValues'

      Ans = ''

      * Open the hashed file that holds the stored values.
      Open "SDKSequences" TO SeqFile Else
         * Open failed. Create the hashed file and try again.
         EXECUTE "CREATE.FILE SDKSequences 2 1 1"
         Open "SDKSequences" TO SeqFile Else Ans = -1
      End

      * GET: return the value stored under the key Identifier.
      If Operation = "GET" Then
         Read LastValue From SeqFile, Identifier Then
            Ans = LastValue
         End Else
            Ans = -1            ;* no value stored for this Identifier
         End
      End

      * STORE: save StoreValue under the key Identifier.
      If Operation = "STORE" Then
         Write StoreValue On SeqFile, Identifier Then
            Ans = 'VALUE STORED'
         End Else
            Ans = -1
         End
      End

      Close SeqFile
Valid values for Operation are STORE or GET. STORE stores the value given in the 'StoreValue' argument under the key given in 'Identifier', and GET fetches the value stored for that 'Identifier'.


Say I want to store a header date (01-01-2005) and use that header date in another job or stage.

First, in one of the Transformers, I use the routine as:
StoreValues('STORE','01-01-2005','HeaderDate')

and later on, wherever you want to use it, call it as:

StoreValues('GET','','HeaderDate')
Shantanu Choudhary
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

One caveat. Routines that use COMMON storage can't be used in jobs that have Buffering enabled. At least not reliably used. :wink:

Plus the fact that while you can use this in the same job to 'pass' values from one Transformer to another, I'm not sure it's valid to say that it would work 'across different jobs'. :?

I'd have to go dig through the docs...
-craig

"You can never have too many knives" -- Logan Nine Fingers
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Variables declared to be in COMMON definitely cannot be used across jobs. Indeed, they are limited to one process, which means that they aren't even shared between the job process and a subordinate Transformer stage process. And, unless the Transformers are directly connected (so that they are executed by the same process), variables in COMMON aren't even shared between two processes executing different Transformer stages in the same job.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
dstechdev
Participant
Posts: 10
Joined: Thu May 27, 2004 6:54 am
Location: Plano, Texas

Post by dstechdev »

I did something similar to this, but I broke it up into two routines. The first routine stored the keyed value in one job and the second routine retrieved it in a subsequent job.
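Roughly (routine names invented, and using the same hashed-file idea as the StoreValues routine above, just split in two):

Code: Select all

      * StoreKeyedValue(Identifier, StoreValue) - hypothetical store routine,
      * called in the first job.
      Ans = 'VALUE STORED'
      Open "SDKSequences" To FileVar Then
         Write StoreValue On FileVar, Identifier Else Ans = -1
         Close FileVar
      End Else
         Ans = -1
      End

      * GetKeyedValue(Identifier) - hypothetical retrieve routine,
      * called in a later job or a Routine Activity.
      Open "SDKSequences" To FileVar Then
         Read Ans From FileVar, Identifier Else Ans = -1
         Close FileVar
      End Else
         Ans = -1
      End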

john
ds_developer
Premium Member
Posts: 224
Joined: Tue Sep 24, 2002 7:32 am
Location: Denver, CO USA

Post by ds_developer »

As you can see from the code, he is actually reading from and writing to a file. This would allow access from other jobs, stages, etc.

John
talk2shaanc
Charter Member
Posts: 199
Joined: Tue Jan 18, 2005 2:50 am
Location: India

Post by talk2shaanc »

Yes, exactly. I am actually using the UniVerse file 'SDKSequences', which is also used by the built-in transforms 'KeyMgtGetNextValue' and 'KeyMgtGetNextValueConcurrent'.
Shantanu Choudhary