access same job
HI All,
We have a job design in which all the jobs (we have about 75 main jobs) call the same subjob (one subjob). It works fine now because we are running them manually, but once it goes into production, will it create a problem? We will be running them in parallel, and some jobs will access the subjob while it is being used by another. (We did parameterize the path and hash file name so that it takes a different name for each job on the fly.)
While trying to analyze this problem, it occurred to me that if we try to open a job that is already opened by another person, we get a message saying it is in use by someone else. Does DataStage behave the same way here, or is the above problem different?
thank you
kris
kris
You will have a problem in Production, as a 'normal' job can only be run by one controlling process at a time. If one 'main' job has it running, any other that attempts to start it will error.
You'll need to make your job a Multi-Instance job. Search the forum if you're not familiar with that term and read the Help on the subject available from the Director client.
Also ensure that it is designed to be run by multiple processes at the same time. For example, don't have them all write to one sequential filename or clear the same hashed file. Make sure any objects like that have unique names (typically based on the Invocation ID but can be other parameters) which is easy to accomplish.
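The unique-naming idea can be sketched outside DataStage. The point is simply that every writable object's name is derived from the Invocation ID (or another job parameter), so concurrent instances never collide on the same file. This is an illustration of the scheme only, not DataStage code; the function and parameter names here are invented for the example.

```python
def unique_object_path(base_dir: str, object_name: str, invocation_id: str) -> str:
    """Build a per-invocation path so concurrent runs write to distinct files.

    In a real job, base_dir, object_name and invocation_id would arrive as
    job parameters; the Invocation ID is what makes each instance unique.
    """
    return f"{base_dir}/{object_name}_{invocation_id}.seq"

# Two instances of the same job get different targets and cannot clobber
# each other's sequential file or hashed file:
path_a = unique_object_path("/data/stage", "customers", "LOAD_A")
path_b = unique_object_path("/data/stage", "customers", "LOAD_B")
assert path_a != path_b
```

The same pattern applies to hashed file names: build the name from the Invocation ID at run time rather than hard-coding it in the job design.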
It sounds like you've started down that path with your parameters, but it will still need to be a multi-instance job.
-craig
"You can never have too many knives" -- Logan Nine Fingers