How to pass values between jobs?
-
- Participant
- Posts: 37
- Joined: Thu Nov 25, 2004 8:44 pm
- Location: Bangalore, Karnataka, India
Hi
I need to take a country name as a parameter, look it up in a table, and pass the resulting country ID to all subsequently running jobs in a sequence.
There are a couple of values like this that I'll have to pass along.
How do I do this?
TIA
Regards,
Vivek RS
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
Your basic choice is how to write it to disk: in a text file, a hashed file or a database table.
Separate jobs run in separate processes. In theory you could have the one job write to a named pipe and a second job (job sequence) reading from that named pipe, but the coordination of job start times and timeouts would be horrible to manage. Named pipes on Windows are ... "forget about it!"
You could write your own routines (in C) to manage a small piece of shared memory, and store things in there. You would then need to incorporate those routines via the General Call Interface (GCI) into the DataStage executables. Yes, it's doable, but the benefit probably isn't worth the cost.
Also, if it's in a hashed file or a text file, you can use a Routine Activity to read it, and return the value to be picked up by the Job Sequence.
Use a hashed file, use a constant as the key value, and write the value you want to transmit into a non-key column. The nice thing about this approach is that you can use the same hashed files for as many different pieces of information as you want to transmit; just choose a different name for each.
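A sketch of that hashed file idea in a server routine (the file name ParamStore, the key COUNTRY_ID and the routine name are my own illustrative choices, not anything defined by the product):

```basic
FUNCTION StoreCountryId(Value)
* Illustrative only: store Value under a constant key in a hashed
* file so a later job or Routine activity can read it back.
Ans = 0
Open "ParamStore" To F Else
   * Hashed file doesn't exist yet; create it, then open it.
   EXECUTE "CREATE.FILE ParamStore 2 1 1"
   Open "ParamStore" To F Else Ans = -1
End
If Ans = 0 Then
   * Constant key: the same record is overwritten on each run.
   Write Value On F, "COUNTRY_ID"
   Close F
End
```

To pass a second value, write it under a different key (or to a differently named file), exactly as described above.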
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
There are three ways to read the file.
1. In a Command activity you could use cat or head to read the file.
2. In a Routine activity you could invoke a routine (which you'd have to write) that uses OpenSeq, ReadSeq and CloseSeq statements to read the file.
3. You could customise the Job Sequence code by turning it into Job Control code in a server job, and adding statements in there to read the file.
Search the forum for examples, particularly of the third approach.
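A minimal sketch of the Routine activity option, using the OpenSeq, ReadSeq and CloseSeq statements mentioned above (the routine name and the -1 error value are illustrative):

```basic
FUNCTION ReadFirstLine(FileName)
* Return the first line of the text file FileName,
* or -1 if the file can't be opened or read.
Ans = -1
OpenSeq FileName To F Then
   ReadSeq Line From F Then Ans = Line
   CloseSeq F
End
```

The Routine activity's result (its $ReturnValue activity variable) can then be passed as a parameter value to downstream Job activities in the sequence.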
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
-
- Charter Member
- Posts: 199
- Joined: Tue Jan 18, 2005 2:50 am
- Location: India
Hey, I created a routine in which you can initialise a variable/value in COMMON, and that variable can then be used in the same job or across different jobs.
FUNCTION StoreValues(Operation, StoreValue, Identifier)
Operation's valid values are STORE and GET. STORE stores the value passed in the 'StoreValue' argument under the name passed in the 'Identifier' argument; GET fetches the value stored under 'Identifier'.
Say I want to store a header date (01-01-2005) and use this header date in another job or stage.
First, in one of the Transformers, I call the routine as:
StoreValues('STORE','01-01-2005','HeaderDate')
and later on, wherever I want to use it, call it as:
StoreValues('GET','','HeaderDate')
FUNCTION StoreValues(Operation, StoreValue, Identifier)
Common /StoreValues/ Initialized, SeqFile
Equate RoutineName To 'StoreValues'
Ans = ''
If Unassigned(Initialized) Then Initialized = @False
If Not(Initialized) Then
   Open "SDKSequences" To SeqFile Else
      * Open failed. Create the file, then open it.
      EXECUTE "CREATE.FILE SDKSequences 2 1 1"
      Open "SDKSequences" To SeqFile Else Ans = -1
   End
   * Keep the file handle in COMMON so it is opened only once per process.
   Initialized = @True
End
If Operation = "GET" Then
   * Read the value stored under the given identifier.
   Read StoredValue From SeqFile, Identifier Then
      Ans = StoredValue
   End Else Ans = -1
End
If Operation = "STORE" Then
   * Write the value to the file under the given identifier.
   Write StoreValue On SeqFile, Identifier Then
      Ans = 'VALUE STORED'
   End Else Ans = -1
End
Shantanu Choudhary
One caveat: routines that use COMMON storage can't be used, at least not reliably, in jobs that have row buffering enabled.
Plus the fact that while you can use this in the same job to 'pass' values from one Transformer to another, I'm not sure it's valid to say that it would work 'across different jobs'.
I'd have to go dig through the docs...
-craig
"You can never have too many knives" -- Logan Nine Fingers
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
Variables declared to be in COMMON definitely cannot be used across jobs. Indeed, they are limited to one process, which means that they aren't even shared between the job process and a subordinate Transformer stage process. And, unless the Transformers are directly connected (so that they are executed by the same process), variables in COMMON aren't even shared between two processes executing different Transformer stages in the same job.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.