Page 1 of 2

Switching between sequence and job

Posted: Tue Jun 10, 2008 1:10 pm
by mydsworld
I need to run a job from a sequence. Inside that job, I derive a few columns, and based on their values I need to fire different jobs.

Can I set some user variable inside a job and then access it from the sequence? How do I do that?

Posted: Tue Jun 10, 2008 2:31 pm
by Minhajuddin
You can start a job (say 'X') from a sequence (say 'Seq_A') - a sequence can call either a job or another sequence. But DataStage doesn't have any stage (in the Server or Parallel edition) that will call a job from within a Parallel or Server job.

Posted: Tue Jun 10, 2008 2:42 pm
by chulett
You need some way for the job to stash / land that data in such a way that the Sequence job can pick it up and pass it along. For a Server job I'd suggest USERSTATUS; however, either job type can write the data to a flat file, and a Sequence routine can pick the values up and pass them as job parameters to downstream jobs.

Posted: Tue Jun 10, 2008 2:52 pm
by mydsworld
If I use UserStatus, how do I assign a value to it inside the job?

Posted: Tue Jun 10, 2008 2:57 pm
by chulett
Search the forums for USERSTATUS; it's all out there, including some code that may come in handy.
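The pattern those forum posts usually show is a one-line Server transform function wrapping DSSetUserStatus, called from a Transformer derivation. A minimal sketch (the routine name is mine):

```
* Transform function: stash the derivation value in the job's user
* status area and pass it through unchanged.
FUNCTION SetUserStatus(Arg1)
      Call DSSetUserStatus(Arg1)
      Ans = Arg1
RETURN(Ans)
```

Call it as the derivation of a dummy output column, e.g. SetUserStatus(InLink.MyValue); the Sequence can then read the value from that job activity's $UserStatus variable.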

Posted: Tue Jun 10, 2008 2:58 pm
by mydsworld
Here is my requirement:

I have a sequence MySeq that has job J1.

Now J1 produces N records (each containing, say, 5 columns).

Now, based on each record's column values, I need to call different jobs, say J2, J3 etc., in the sequence MySeq.

How do I implement that?

Posted: Tue Jun 10, 2008 4:21 pm
by chulett
Off the top of my head, a routine to read those values in and a Nested Condition stage to split the flow from there to the right Job Activity. Perhaps all in a looping structure supported by a UserVariables Activity stage.

Posted: Tue Jun 10, 2008 6:06 pm
by mydsworld
How can we use a routine to read those values in? If I store the N rows in a database table (in job J1), will I be able to read those values from the sequence using routines?

Posted: Tue Jun 10, 2008 7:15 pm
by chulett
Land them. A routine can then build the delimited list that the UserVariables and/or Start Loop stages are looking for, perhaps by something as simple as cat'ing them to standard out and capturing that output. You could also use a function like Convert() to flatten the dynamic array into a comma-delimited list. Many ways to skin this cat.
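As a sketch of the flat-file approach, a routine (the name and argument are mine) could read the landed file and return a comma-delimited list that a Start Loop or UserVariables activity can consume:

```
* Read each line of the landed file and append it to a comma-delimited
* list. If the values were read into a dynamic array instead, a single
* Convert(@FM, ',', DynArray) would do the same flattening.
FUNCTION BuildJobList(FileName)
      Ans = ''
      OpenSeq FileName To FileVar Then
         Loop
            ReadSeq Line From FileVar Else Exit
            If Ans = '' Then Ans = Line Else Ans := ',' : Line
         Repeat
         CloseSeq FileVar
      End Else
         Call DSLogWarn('Cannot open ' : FileName, 'BuildJobList')
      End
RETURN(Ans)
```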

Posted: Tue Jun 10, 2008 8:55 pm
by mydsworld
Can we use USERSTATUS instead of a routine to capture the delimited values?

Posted: Tue Jun 10, 2008 9:03 pm
by ray.wurlod
Yes.

You would need to parse them out of the $UserStatus activity variable, probably using Field() functions.
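For example, in a UserVariables Activity the derivations could look like this (the activity name Job_J1, the variable names, and the comma delimiter are assumptions):

```
* Pick the landed values out of the comma-delimited user status
* set by job activity Job_J1:
uvJobName = Field(Job_J1.$UserStatus, ',', 1)
uvKeyValue = Field(Job_J1.$UserStatus, ',', 2)
```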

Posted: Tue Jun 10, 2008 11:42 pm
by Minhajuddin
mydsworld wrote: Here is my requirement:

I have a sequence MySeq that has job J1.

Now J1 produces N records (each containing, say, 5 columns).

Now, based on each record's column values, I need to call different jobs, say J2, J3 etc., in the sequence MySeq.

How do I implement that?
This seems simple to me, I hope I am not missing anything.
My assumption is that you would know beforehand what jobs J2,J3 .. are.

For now let's say that job J1 generates data which can potentially be processed by jobs J2, J3, J4 (based on your column which decides the job).

Design your job J1 in such a way that it creates three datasets: pass the last output link through a Transformer and then to three datasets (say J2.ds, J3.ds, J4.ds). These three links can be constrained so that only the records that have to go to job J3 go to J3.ds. After this you just have to connect these four jobs through a sequence; you can trigger jobs J2, J3, J4 simultaneously after J1 finishes successfully.

Posted: Wed Jun 11, 2008 12:51 am
by JoshGeorge
If you want to call different jobs from the current processing job (i.e. J1) try:

Server Job: Write a small routine which will just attach and run the job.
Parallel Job: Try an External Filter stage with the dsjob command, or try a parallel routine.
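A sketch of the server-routine approach (the routine name is mine, and error handling is reduced to the bare minimum):

```
* Attach a job by name, run it with its default parameters, wait for
* it to finish, and return its finishing status.
FUNCTION RunJobByName(JobName)
      hJob = DSAttachJob(JobName, DSJ.ERRFATAL)
      ErrCode = DSRunJob(hJob, DSJ.RUNNORMAL)
      ErrCode = DSWaitForJob(hJob)
      Status = DSGetJobInfo(hJob, DSJ.JOBSTATUS)
      ErrCode = DSDetachJob(hJob)
      Ans = Status
RETURN(Ans)
```

For the parallel-job route, the same effect comes from shelling out to dsjob -run from the External Filter stage or a parallel routine.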

Posted: Wed Jun 11, 2008 3:11 am
by ag_ram
Minhajuddin wrote: For now let's say that job J1 generates data which can potentially be processed by jobs J2, J3, J4 (based on your column which decides the job).

Design your job J1 in such a way that it creates three datasets: pass the last output link through a Transformer and then to three datasets (say J2.ds, J3.ds, J4.ds). These three links can be constrained so that only the records that have to go to job J3 go to J3.ds. After this you just have to connect these four jobs through a sequence; you can trigger jobs J2, J3, J4 simultaneously after J1 finishes successfully.
But I feel that this suggestion would result in:

1. Unwanted creation of datasets (two unwanted datasets in every run).
2. Unwanted execution of the rest of the jobs.

I suppose chulett has already given a sufficient solution to this problem.

Posted: Wed Jun 11, 2008 6:03 am
by Minhajuddin
ag_ram wrote: But I feel that this suggestion would result in:

1. Unwanted creation of datasets (two unwanted datasets in every run).
2. Unwanted execution of the rest of the jobs.

I suppose chulett has already given a sufficient solution to this problem.
I don't know about others, but if I find a simple way to solve a problem, I would rather go with that solution than with a Rube Goldberg kind of solution. Moreover, the OP has not mentioned that the data to be processed is mutually exclusive.