JobDesign Question
Hi experts,
I have one process:
1. A query (which will have a date as a parameter) is called as the job input.
2. Based on the data returned by that query, records must be purged from the target database table (we can set that table as the target for this action), and the same records must also be copied into a sequential file.
All of the above I can achieve with one job.
But the question is that I have to repeat this job for 20-25 tables, because the query can be different for each table (for example, the query compares that table's DATE column, and based on the date parameter provided it retrieves some set of rows). So in all I have 25 tables and 25 queries, but for one table we can provide only one query as input.
So how can I automate this process? Or, going the plain way, do I need to design 25 jobs for the 25 tables, with each job calling its corresponding query?
Please elaborate more on this.
mjmtabor wrote: Make the table name and the fields you want to check job parameters. Create a schedule that calls the job with the proper table and parameter values you need. The job will just run with different parameters based on what you need to do. This assumes that you check the same number of fields each time.
Have a dummy file go through a link or transformer, and have the ODBC stage use user-defined SQL in which you have a parameter for the table name and a parameter for the value you are looking for. You could also have the field name as a parameter and just send it in from the job sequence calling the job:
DELETE FROM #stage_table# WHERE Field_name = '#field_value#'
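To make the mechanism concrete, here is a rough sketch (in Python, not DataStage) of the `#param#` substitution that the user-defined SQL relies on; the function name and dictionary are my own illustration, not a DataStage API:

```python
def substitute(sql_template, params):
    """Replace each #name# placeholder with the value supplied by the calling sequence."""
    for name, value in params.items():
        sql_template = sql_template.replace("#%s#" % name, value)
    return sql_template

template = "DELETE FROM #stage_table# WHERE Field_name = '#field_value#'"
print(substitute(template, {"stage_table": "table1", "field_value": "25-may-2009"}))
# DELETE FROM table1 WHERE Field_name = '25-may-2009'
```

The calling sequence supplies different `stage_table`/`field_value` pairs on each invocation, so one job definition covers every table.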
I'm adding more on this. As per your suggestion, I am creating a parameter table like this:

User-defined SQL                                       Table name   Parameter name
-----------------------------------------------------  -----------  --------------
select * from table 1 where date_field = 25-may-2009   table 1      date_field1
select * from table 2 where date_field = 25-may-2009   table 2      date_field2
select * from table 3 where date_field = 25-may-2009   table 3      date_field3
select * from table 4 where date_field = 25-may-2009   table 4      date_field4

Here date_field depends on the date column of the respective table. Should I continue this way, or are there any more suggestions/additions?
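To show how a parameter table like this could drive the purge, here is a hedged Python sketch; the row contents and function names are illustrative assumptions, not an actual DataStage job:

```python
# Hypothetical parameter table: one row per target table, naming the
# date column to compare against the run date.
PARAM_TABLE = [
    ("table1", "date_field1"),
    ("table2", "date_field2"),
    ("table3", "date_field3"),
]

def build_purge_sql(table, date_col, run_date):
    # One DELETE per parameter-table row. In a real database the date
    # would normally be bound as a parameter rather than interpolated.
    return "DELETE FROM %s WHERE %s = '%s'" % (table, date_col, run_date)

for table, date_col in PARAM_TABLE:
    print(build_purge_sql(table, date_col, "25-may-2009"))
```

The loop is the key point: one generic statement plus a row per table replaces 25 hand-built jobs.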
mjmtabor wrote: Have a dummy file go through a link or transformer, and have the ODBC stage use user-defined SQL in which you have a parameter for the table name and a parameter for the value you are looking for. You could also have the field name as a parameter and just send it in from the job sequence calling the job:
DELETE FROM #stage_table# WHERE Field_name = '#field_value#'
I know, it's just that sometimes it amuses me how long people will spend trying to build a 'one job' solution that could (more than likely) be troublesome to support and maintain when they could have banged out the oh-so-horrible multiple job solution on the first day. But that's just me.
Best of luck. Carry on!
-craig
"You can never have too many knives" -- Logan Nine Fingers
"You can never have too many knives" -- Logan Nine Fingers
But I think it will be a problem, because all the column names imported into the transformer depend on the table name or the query we provide as input. So how do we cope with this?
mjmtabor wrote: Have a dummy file go through a link or transformer, and have the ODBC stage use user-defined SQL in which you have a parameter for the table name and a parameter for the value you are looking for. You could also have the field name as a parameter and just send it in from the job sequence calling the job:
DELETE FROM #stage_table# WHERE Field_name = '#field_value#'
I'm not sure I understand your question, but if the number of fields you are checking is the same, make the field name a parameter as well:
DELETE FROM #stage_table# WHERE #field_name# = '#field_value#'
In the job schedule you create, you will set the values:
stage_table = table1
field_name = date_field1
field_value = 25-may-2009
Can I prepare a sequential file to provide as input for this? It would contain:

table_name    column_name
----------    -----------
table1        DateField1
table2        DateField2
table3        DateField3

But is there any way to read this parameter file as input parameters, so that I can read the file and run the underlying job with those parameters? The underlying job would process a statement like:
DELETE FROM #table_name# WHERE #field_name# = '25-may-2009'
Or:
In a LOOP, does anybody know how to pass two values separated by a comma? Here the date column can be different for each table, so in the LOOP can I pass values like (table1,date_Col1), (table2,date_Col2), (table3,date_Col3), (table4,date_Col4)? Using the values passed in the LOOP I could then construct a parameterised job that uses those values and processes a statement like:
DELETE FROM #table_name# WHERE #field_name# = '25-may-2009'
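One way to sketch the pair-passing idea (in Python, not DataStage sequence syntax): encode each (table, column) pair in a single loop value and split it apart inside the job. The `;` between pairs and the parenthesised pair format are my assumptions, chosen so the comma inside each pair does not clash with the list delimiter:

```python
def parse_pairs(loop_value):
    """Split a string like "(t1,c1);(t2,c2)" into [("t1", "c1"), ("t2", "c2")]."""
    pairs = []
    for item in loop_value.split(";"):
        table, column = item.strip("() ").split(",")
        pairs.append((table.strip(), column.strip()))
    return pairs

loop_value = "(table1,date_Col1);(table2,date_Col2);(table3,date_Col3)"
for table, column in parse_pairs(loop_value):
    print("DELETE FROM %s WHERE %s = '25-may-2009'" % (table, column))
```

The same parsing could just as well read the two-column sequential file described above, one line per table.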
This might help: viewtopic.php?t=127425
-craig
"You can never have too many knives" -- Logan Nine Fingers
"You can never have too many knives" -- Logan Nine Fingers