Controlling job flow by counting rows in hashed files - ???
Posted: Wed Sep 26, 2007 10:11 am
Hello to all the DS Gurus!
I have an interesting situation and I would like opinions from those who may have dealt with this type of situation. Hopefully, this isn't just me having a mental speed bump and overcomplicating things.
(That would be embarrassing.)
Here is the background:
I have 16 companies that run the same coded jobs. All filenames, whether for sequential files or hashed files, are dynamically created from parameters passed in through Ken Bland's job control architecture. Given the 194 tables at 11 jobs per table that have to be processed for each company, this equates to a large number of jobs and files being processed each night, and I have about a 5-hour window to process all systems to completion. I am rewriting a process that is currently done by SQL Server 2000 DTS packages. I need to rebuild it because it will be moving to another RDBMS at some point in the future, and since changes are required now, I don't want to do this work twice.
Here is the need:
I am looking for a way to stop the process for tables that don't need to run. I figure the best way to do this is to count the number of rows in the surrogate key hashed files (Step 3 of 11). If there is more than one row in the Inserts or Updates hashed files, then the jobs need to run. If not, then the rest of the processing for that table and that company should stop.
The names of the files are dynamic and fully pathed, so using the SETFILE command to create a VOC entry just to get a count, and then having to delete those entries as soon as processing is done, seems inefficient to me.
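For what it's worth, one way to sidestep the temporary VOC entry entirely is UniVerse BASIC's OPENPATH statement, which opens a hashed file directly by its OS path. A rough sketch of a row-counting routine along those lines (the variable names and the surrounding framing are illustrative, not from the post):

```
* Sketch only: count rows in a hashed file opened by path, no VOC entry needed.
* HashedFilePath would come from the same job parameters that build the file name.
OPENPATH HashedFilePath TO F.HASH ELSE
   Call DSLogWarn("Cannot open " : HashedFilePath, "RowCounter")
   RETURN
END

RowCount = 0
SELECT F.HASH
Done = @FALSE
LOOP
   READNEXT RecordId ELSE Done = @TRUE
UNTIL Done DO
   RowCount = RowCount + 1
REPEAT

* If RowCount clears the threshold, fire the downstream jobs for this
* table/company; otherwise skip the remaining steps.
```

READNEXT walks every record ID, so for very large files this costs a full scan; since the goal is only "are there enough rows to bother running," the loop could just as well bail out as soon as the threshold is reached.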
I believe the entries would have to be out of the VOC before my cleanup process runs. I am building a Perl app that moves all the files built for a particular batch into a tarball, zips it, and stores it in an archive directory until it has been backed up; after that, the archive is deleted. (My version of housekeeping.)
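The housekeeping step described above could also be done with a few lines of shell instead of a Perl app. A minimal sketch, assuming the batch files live under one data directory; BATCH_ID, DATA_DIR, and ARCHIVE_DIR are illustrative names, not from the original post:

```shell
#!/bin/sh
# Sketch: gather every file produced for one batch into a compressed tarball
# in an archive directory, then remove the originals.
BATCH_ID="20070926_FULL"
DATA_DIR="/tmp/ds_batch_demo/data"
ARCHIVE_DIR="/tmp/ds_batch_demo/archive"

mkdir -p "$DATA_DIR" "$ARCHIVE_DIR"
# (demo only) create a couple of stand-in batch files
touch "$DATA_DIR/${BATCH_ID}_Inserts.hsh" "$DATA_DIR/${BATCH_ID}_Updates.hsh"

# tar and gzip everything belonging to this batch, then delete the originals
tar -czf "$ARCHIVE_DIR/${BATCH_ID}.tar.gz" -C "$DATA_DIR" .
rm -rf "$DATA_DIR"
```

Note that dynamic (type 30) hashed files are directories on disk, which tar handles transparently; the archive would then sit in ARCHIVE_DIR until the backup runs.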
I am open to any other creative/mundane suggestions/ideas and would appreciate any input our community could offer.