Output to table based on value in field
Hi,
Some help and advice on the following would be appreciated:
I have one job, and I want it to output to multiple tables dynamically based on a value in a field. The targets are AS/400 files (tables), all with the same metadata.
Example: Business Unit (BU) holds 8 distinct values, and I want to output to 8 tables dynamically, one per BU. A BU of 393 should be output to TABLENAME393, a BU of 392 to TABLENAME392, etc.
I want to avoid hardcoding 8 output links with 8 constraints in the transformer, since more BUs could be added in the future. So I would like this job to be generic, with 1 output link.
I have looked into using multiple-instance jobs with an invocation ID for each BU, but this is still hardcoding.
Re: Output to table based on value in field
What you are trying to do cannot be completely achieved using GUI jobs; you could do it via a BASIC job.
If you want to make a generic job that writes to a "dynamic" table, this can be done as long as all of the tables have the same DDL. You could make the table name a job parameter and start the job with a specific parameter value for a specific table, using the parameter in the output stage. But this does not work on a row basis.
On a row basis the only way, to my knowledge, is a BASIC job.
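To illustrate what a row-level BASIC approach might look like: a routine could build the table name from the BU value of each row and issue the INSERT through the BASIC SQL Client Interface (BCI) described in the Server Job Developer's Guide. This is only a rough sketch, not a tested implementation; the DSN "AS400DSN", the credentials, the column list and the variable names are all assumptions:

```
* Rough sketch only: route one row to the table named by its BU value.
* "AS400DSN", the user/password and the column list are hypothetical,
* and per-call error checking is omitted for brevity.
Status = SQLAllocEnv(hEnv)
Status = SQLAllocConnect(hEnv, hConn)
Status = SQLConnect(hConn, "AS400DSN", "myuser", "mypasswd")
Status = SQLAllocStmt(hConn, hStmt)

* Bu and ColValue would come from the row currently being processed.
InsertStmt = "INSERT INTO TABLENAME" : Bu : " (COL1) VALUES ('" : ColValue : "')"
Status = SQLExecDirect(hStmt, InsertStmt)

Status = SQLFreeStmt(hStmt, SQL.DROP)
Status = SQLDisconnect(hConn)
```

In practice you would also want to cache the connection across rows and check each returned Status, but that is the general shape of a row-by-row BASIC solution.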
Ogmios
In theory there's no difference between theory and practice. In practice there is.
How about this:
Create an intermediate hash file with the BU number as part of the primary key. This file would mirror the structure of the tables you are trying to populate.
Next, create a simple job that reads the hash file with a parameterized select phrase and writes to your AS/400 tables with parameterized table names:

    HASH ---> XFR ---> AS400

Your hash file select phrase would look something like:

    BU = #BuNumber#

Your AS/400 stage table name would look something like:

    TABLENAME#BuNumber#

Lastly, create a job sequence that runs the above job in a loop for each possible BU number.
All of this is straightforward, GUI-based development.
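If you would rather drive the loop from a job control routine than from a sequence, the same idea can be sketched with the DataStage BASIC job control API. The job name "LoadBU" and the parameter name "BuNumber" are placeholders standing in for your own:

```
* Run the parameterized load job once per BU number.
* "LoadBU" and "BuNumber" are placeholder names; error handling
* beyond DSJ.ERRFATAL is omitted for brevity.
BuList = "393" : @FM : "392" : @FM : "391"
NumBu = Dcount(BuList, @FM)
For I = 1 To NumBu
   Bu = BuList<I>
   hJob = DSAttachJob("LoadBU", DSJ.ERRFATAL)
   ErrCode = DSSetParam(hJob, "BuNumber", Bu)
   ErrCode = DSRunJob(hJob, DSJ.RUNNORMAL)
   ErrCode = DSWaitForJob(hJob)
   ErrCode = DSDetachJob(hJob)
Next I
```

Each pass selects one BU's rows from the hash file and lands them in the matching TABLENAME#BuNumber# table.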
Chuck Smith
www.anotheritco.com
Personally I was thinking of a complete BASIC job... not only the controlling job.
Instead of the hash files in the above post you could also load the data to database tables to accomplish the same. The "problem" with using parameters is that they cannot be assigned during the run of a job; the job itself cannot assign parameter values based upon row values.
If you do it with a GUI job, running it several times in a row with different parameter values, you do have to take care of restartability, ...
Ogmios
You could create a multi-instance job (perhaps use the BU as invocation ID) supplying the appropriate BU job parameter value for each invocation.
Execute them from a job sequence that feeds each parameter value appropriately. Each job processes only the data for its particular BU; it's best to ensure this in the extraction phase. (You can even run the jobs in parallel but don't overload your CPUs!)
This is not the appropriate solution if the domain of BUs is unknown in advance but, if this were the case, how would you guarantee that the target table(s) would be available?
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
A completely maintainable approach would have the list of BUs read in by a routine that starts a multiple-instance job for each BU in the list. This list could be a flat file, a delimited job parameter, or a database table.
Starting a job from a job control routine is very easy. I suggest you play around with a batch job for a bit: add your multiple-instance job to a batch job using the combo box, then copy and paste the generated code into your routine.
Reading values from a file is covered in this forum (look for sequence commands) and in the Server Job Developer's Guide.
You just need a loop that, for each BU, starts up an instance of the job and passes it the BU as the invocation ID. Within the job you can put the invocation ID into the table name field using TABLENAME#DSInvocationId#.
An administrator can then add or remove BUs by changing the file or job parameter that feeds the list to the job.
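As a sketch of the loop described above, a job control routine could read the BU list from a flat file and fire off one instance per BU. The file path, the routine name passed to DSLogFatal, and the job name "LoadBU" are all placeholders:

```
* Start one instance of the multiple-instance job per BU in the list file.
* "/path/to/bu_list.txt", "RunAllBUs" and "LoadBU" are placeholder names.
OpenSeq "/path/to/bu_list.txt" To BuFile Else Call DSLogFatal("Cannot open BU list file", "RunAllBUs")
Loop
   ReadSeq Bu From BuFile Else Exit
   * Invocation ID = BU, so the job can use TABLENAME#DSInvocationId#
   hJob = DSAttachJob("LoadBU." : Bu, DSJ.ERRFATAL)
   ErrCode = DSRunJob(hJob, DSJ.RUNNORMAL)
   ErrCode = DSWaitForJob(hJob)
   ErrCode = DSDetachJob(hJob)
Repeat
CloseSeq BuFile
```

Because each instance is attached as "jobname.invocationid", the instances can also be run in parallel by deferring the DSWaitForJob calls until after the loop.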
Certus Solutions
Blog: Tooling Around in the InfoSphere
Twitter: @vmcburney
LinkedIn: Vincent McBurney