JobDesign Question

Post questions here relating to DataStage Server Edition, covering such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

qutesanju
Participant
Posts: 373
Joined: Tue Aug 26, 2008 4:52 am

JobDesign Question

Post by qutesanju »

Hi experts,
I have one process:
1. A query (which takes a date as a parameter) is called as the job input.
2. Based on the data returned by that query, records must be purged from the target database table (which we can set as the target), and the same records must also be copied into a sequential file.

I can achieve all of the above using one job.

But the question is that I have to repeat this job for 20-25 tables, because the query can be different for each table (for example, each query compares that table's DATE column, and based on the date parameter provided it retrieves some set of rows).

So in all I have 25 tables and 25 queries, but for one job we can provide only one query as input.

How can I automate this process? Or, going the plain way, do I need to design 25 jobs for all 25 tables, with the corresponding query called in each job?
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Will the output metadata / columns always be the same, regardless of table or query logic? If so, perhaps the "sql file" option would help here; leverage a job parameter to drive which one to use at runtime.
-craig

"You can never have too many knives" -- Logan Nine Fingers
qutesanju
Participant
Posts: 373
Joined: Tue Aug 26, 2008 4:52 am

Post by qutesanju »

Nope, the output columns depend on the query provided.
chulett wrote:Will the output metadata / columns always be the same, regardless of table or query logic? If so, perhaps the "sql file" option would help here, leverage a job parameter to drive which one to u ...
mjmtabor
Participant
Posts: 7
Joined: Wed Feb 28, 2007 8:03 am

Post by mjmtabor »

Create the table name and the fields you want to check as parameters. Create a schedule that calls the job with the proper table and parameters you need. The job will just have different parameters based on what you need to do. This assumes that you check the same number of fields each time.
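A minimal dry-run sketch of such a schedule, using the `dsjob` command-line interface; the project name (MyProject), job name (PurgeJob), and parameter names are hypothetical placeholders, not from this thread:

```shell
#!/bin/sh
# Dry run: build (and here just print, rather than execute) the dsjob command
# that would run one parameterized purge job per table.
purge_cmd() {
  # $1 = table name, $2 = cutoff date value
  echo "dsjob -run -param stage_table=$1 -param field_value=$2 -jobstatus MyProject PurgeJob"
}

for TABLE in table1 table2 table3; do
  purge_cmd "$TABLE" "25-may-2009"
done
```

Dropping the `echo` inside `purge_cmd` would actually launch the job; `-jobstatus` makes `dsjob` wait and return the job's finishing status, so the loop could stop on the first failure.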
qutesanju
Participant
Posts: 373
Joined: Tue Aug 26, 2008 4:52 am

Post by qutesanju »

Please elaborate more on this.
mjmtabor wrote:Create the table name and the fields you want to check as parameters. Create a schedule that calls the job with the proper table and parameters you need. The job will just have different parameters based on what you need to do. This assumes that you check the same number of fields each time
mjmtabor
Participant
Posts: 7
Joined: Wed Feb 28, 2007 8:03 am

Post by mjmtabor »

Have a dummy file go through a link or transformer, and have the ODBC process use user-defined SQL where you have a parameter for the table name and a parameter for the value you are looking for. You could also have the Field_name as a parameter as well and just send it in from the job sequence calling the job.

DELETE FROM #stage_table# WHERE Field_name = '#field_value#'
qutesanju
Participant
Posts: 373
Joined: Tue Aug 26, 2008 4:52 am

Post by qutesanju »

I am adding more on this.

As per your suggestion, I am creating a parameter table like this:

User-defined SQL                                        table name   parameter name
------------------------------------------------------  -----------  --------------
select * from table1 where date_field1 = 25-may-2009    table1       date_field1
select * from table2 where date_field2 = 25-may-2009    table2       date_field2
select * from table3 where date_field3 = 25-may-2009    table3       date_field3
select * from table4 where date_field4 = 25-may-2009    table4       date_field4

Here the date_field depends on the date column of the respective table.

Should I continue in this way, or are there any more suggestions/additions?

mjmtabor wrote:Have a dummy file go through a link or transformer, and have the ODBC process use user-defined SQL where you have a parameter for the table name and a parameter for the value you are looking for. You could also have the Field_name as a parameter as well and just send it in from the job sequence calling the job

DELETE FROM #stage_table# WHERE Field_name = '#field_value#'
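That parameter-table idea can be sketched as a control file driving the purge SQL; the file location, table names, and column names below are illustrative, not from the actual system:

```shell
#!/bin/sh
# Hypothetical control file: one "table_name date_column" pair per line.
cat > /tmp/purge_params.txt <<'EOF'
table1 date_field1
table2 date_field2
table3 date_field3
EOF

# For each row, build the user-defined SQL the parameterized job would run.
while read TABLE FIELD; do
  echo "DELETE FROM $TABLE WHERE $FIELD = '25-may-2009'"
done < /tmp/purge_params.txt
```

In practice the `echo`'d statement would instead be handed to the job as parameter values (one `dsjob -run` per row), so adding a 26th table means adding one line to the control file rather than a new job.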
chulett
Charter Member
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

You could have built your 25 jobs days ago and be long done by now. :wink:
-craig

"You can never have too many knives" -- Logan Nine Fingers
qutesanju
Participant
Posts: 373
Joined: Tue Aug 26, 2008 4:52 am

Post by qutesanju »

The straightforward procedure I know, but I want to automate this; that's why I posted this thread :lol:
chulett wrote:You could have built your 25 jobs days ago and be long done by now. :wink: ...
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

I know, it's just that sometimes it amuses me how long people will spend trying to build a 'one job' solution that could (more than likely) be troublesome to support and maintain when they could have banged out the oh-so-horrible multiple job solution on the first day. But that's just me.

Best of luck. Carry on! :wink:
-craig

"You can never have too many knives" -- Logan Nine Fingers
qutesanju
Participant
Posts: 373
Joined: Tue Aug 26, 2008 4:52 am

Post by qutesanju »

But I think it will be a problem, because the column names imported into the transformer depend on the table name or the query we provide as input. So how do we cope with this?
mjmtabor wrote:Have a dummy file go through a link or transformer, and have the ODBC process use user-defined SQL where you have a parameter for the table name and a parameter for the value you are looking for. You could also have the Field_name as a parameter as well and just send it in from the job sequence calling the job

DELETE FROM #stage_table# WHERE Field_name = '#field_value#'
mjmtabor
Participant
Posts: 7
Joined: Wed Feb 28, 2007 8:03 am

Post by mjmtabor »

I'm not sure I understand your question, but if the number of fields you are checking is the same, have the field name as a parameter also:

DELETE FROM #stage_table# WHERE #field_name# = '#field_value#'

In the job schedule you create, you will supply the values:

stage_table = table1
field_name = date_field1
field_value = 25-may-2009
qutesanju
Participant
Posts: 373
Joined: Tue Aug 26, 2008 4:52 am

Post by qutesanju »

Can I prepare a sequential file to provide as an input for this?

It will contain:

table_name   column_name
----------   -----------
table1       DateField1
table2       DateField2
table3       DateField3

But is there any way to read this parameter file as input parameters, so that I can read this file and process the underlying job with the parameters it supplies?

The underlying job will process a statement like:
DELETE FROM #table_name# WHERE #field_name# = '25-may-2009'

Or:

In a LOOP, does anybody know how to pass two values separated by a comma? Here, for each table, the date column can be different.

So in the LOOP can I pass pairs like (table1,date_Col1), (table2,date_Col2), (table3,date_Col3), (table4,date_Col4)? And using the values passed in the LOOP I can construct a parameterized job, which will use values like the above and process a
statement like DELETE FROM #table_name# WHERE #field_name# = '25-may-2009'
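One way to sketch that comma-separated loop outside DataStage (inside a sequence, the equivalent split is usually done with the BASIC Field() function on the loop counter, e.g. Field(%Counter%, ",", 1)); the table and column names here are illustrative:

```shell
#!/bin/sh
# Each loop value is a "table,date_column" pair; split it on the comma and
# substitute both halves into the parameterized DELETE statement.
for PAIR in table1,date_Col1 table2,date_Col2 table3,date_Col3; do
  TABLE=${PAIR%,*}    # text before the comma
  FIELD=${PAIR#*,}    # text after the comma
  echo "DELETE FROM $TABLE WHERE $FIELD = '25-may-2009'"
done
```

The `%,*` / `#*,` parameter expansions are standard POSIX shell; in a job sequence the two halves would instead be passed as the `#table_name#` and `#field_name#` job parameters on each iteration.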
qutesanju
Participant
Posts: 373
Joined: Tue Aug 26, 2008 4:52 am

Post by qutesanju »

:D
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

This might help: viewtopic.php?t=127425
-craig

"You can never have too many knives" -- Logan Nine Fingers
Post Reply