Interesting issue
Moderators: chulett, rschirm, roy
I have a job that is looking for a file to start. The problem is that if the file is still being written when the job starts looking, the job aborts. I know there is an easy way to have a DOS batch file test whether the file is finished being written and then call a command to start a job... I do this all the time using SQL Server and DTS jobs, but we are migrating off of SQL Server and I need similar functionality in DS. I don't mind having a batch file sniff for the file and then kick off a DS job... the question I have is: what would the code to start the DS job look like?
Jim
Sure I need help... But who doesn't?
Thanks, that's what I needed.chulett wrote:Server Job Developer's Guide - look for the "Command Line Interface" section. Typically Chapter 14, I believe. It spells out the syntax and usage of the "dsjob" command, which is what you'll need in your batch to launch a DataStage job.
Jim
Sure I need help... But who doesn't?
The command line interface to running jobs is "dsjob". This executable is heavily discussed on this forum. Search for it and you'll get a zillion hits. Think of it as a command line Director: you can start, stop, and reset jobs. You can pull out job log information and all kinds of goodies.
This program is probably the best way to have your jobs initiated by a third-party piece of software, such as an enterprise scheduler.
Kenneth Bland
Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
From what I read about it in the online manual, it appears that I will need two lines of code: one to log on and one to run the job.kcbland wrote:The command line interface to running jobs is "dsjob". This executable is heavily discussed on this forum. Search for it and you'll get a zillion hits. Think of it as a command line Director: you can start, stop, and reset jobs. You can pull out job log information and all kinds of goodies.
This program is probably the best way to have your jobs initiated by a third-party piece of software, such as an enterprise scheduler.
dsjob -server Scrbbususcnc04 -user condji -password ********
dsjob -run Development\WQASImport
There don't seem to be any good examples... though I have only browsed it so far. Am I correct in my thinking?
Jim
Sure I need help... But who doesn't?
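For what it's worth, the logon details and the run request can typically go into a single dsjob invocation rather than two separate commands. A minimal sketch using the host, user, project, and job names from this thread (the password is a placeholder, and dsjob is stubbed with echo here so the sketch can run without a DataStage install):

```shell
#!/bin/sh
# Sketch: launching a DataStage job with dsjob, combining connection details
# and the run request in one call. Host/user/project/job names come from this
# thread; the password is a placeholder. dsjob itself is stubbed with echo so
# the sketch is runnable anywhere.
DSJOB="echo dsjob"           # replace with the path to the real dsjob binary

CMD=$($DSJOB -server Scrbbususcnc04 -user condji -password secret \
      -run -jobstatus Development WQASImport)
echo "$CMD"
```

The -jobstatus option makes dsjob wait for the job to finish and return the job's status as its exit code, which a calling batch file can check via ERRORLEVEL. Check the Server Job Developer's Guide for the exact options available on your release.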
I did, and it looks very in-depth... but it's way over my head right now. I have it bookmarked so that one day I'll be able to get back to it and understand what it is doing in totality.kcbland wrote:Check out the post I just put up: Shell script to start a job using dsjob
Jim
Sure I need help... But who doesn't?
Why do you do all this? Here's a simple trick.
Let your source process create a control file / indicator file as soon as the data is written completely to the data file. Have your process keep looking for the control/indicator file; once this file appears, it implies that the data has been completely written to the source data file and it is now ready to be processed.
Ex.: if data is being written to the file "Sourcedata.dat", then create another file "SourceCntlFile" which would contain just a single line. Now have your process look for "SourceCntlFile", as it is created after Sourcedata.dat.
This should fix your problem of starting your process while the source file is still being created.
Rahul
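Rahul's handshake can be sketched in a few lines of shell. The sender-side lines are included only to make the sketch self-contained; the filenames are the ones from the post:

```shell
#!/bin/sh
# Sketch of the control-file handshake; filenames are the ones from the post.
DATA=Sourcedata.dat
CTL=SourceCntlFile

# --- sender side (shown only so the sketch is self-contained) ---
echo "payload" > "$DATA"     # finish writing the data file first...
echo "done"    > "$CTL"      # ...then drop the one-line indicator file

# --- watcher side ---
until [ -f "$CTL" ]; do
    sleep 60                 # data file may still be in flight; wait
done
echo "control file found; $DATA is complete and safe to process"
```

The same loop works as a DOS batch IF EXIST test; the key point is only that the indicator file is created strictly after the data file is closed.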
I don't control the source process... I cannot create the trigger file.Rahul wrote:Why do you do all this? Here's a simple trick.
Let your source process create a control file / indicator file as soon as the data is written completely to the data file. Have your process keep looking for the control/indicator file; once this file appears, it implies that the data has been completely written to the source data file and it is now ready to be processed.
Ex.: if data is being written to the file "Sourcedata.dat", then create another file "SourceCntlFile" which would contain just a single line. Now have your process look for "SourceCntlFile", as it is created after Sourcedata.dat.
This should fix your problem of starting your process while the source file is still being created.
Rahul
Jim
Sure I need help... But who doesn't?
There's no way to know if an inbound file is complete, no matter what type of sender, without the sender acknowledging the file transfer is complete.
The previously posted "ready" file approach works really well. Another approach is to create the file under a restricted set of permissions and chmod when complete. This allows the "instantaneous" appearance of a complete file.
I see no alternative that does not involve changing the sending process, or the process that controls the sending process. Somehow you need to know it's done sending, otherwise, there could be buffered data in the pipeline you don't see yet. Sampling the file and resampling a few minutes later for no changes is an unreliable way of determining if the file transfer is complete. Again, network congestion, processing bottlenecks, and buffering could be delaying the arrival of the data.
Yours is a classic problem, solved many times over with standard solutions. This is a lesson learned a long time ago, yet apparently someone upstream of you was skipping school.
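The permission-flip approach mentioned above might look like this on the sending side. The filename is illustrative, and the consumer is assumed to poll as a different user, so owner-only permissions keep the in-flight file out of reach:

```shell
#!/bin/sh
# Sketch of the permission-flip handoff (sender side). The filename is
# illustrative; the consumer is assumed to poll as a *different* user, so a
# file readable only by its owner stays closed to it until the chmod.
DATA=landing_data.dat

( umask 077; echo "payload" > "$DATA" )   # created mode 600: reader locked out
# ... the transfer runs to completion here ...
chmod 644 "$DATA"                         # flip: a complete file "appears"
```

Since chmod changes the mode atomically, the reader either cannot open the file at all or sees it whole; there is no window where it reads a partial file.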
Kenneth Bland
Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
To remedy my upstream friends' omission, I'll just write a simple batch file that tries to rename the file and, if it can't, goes into a loop/sleep state. Once it can rename the file, it will rename it again to the name the DS job needs and then kick off the DS job. It's simple, archaic, but well suited to my needs. It seems to be the best way for what I have to do.kcbland wrote:There's no way to know if an inbound file is complete, no matter what type of sender, without the sender acknowledging the file transfer is complete.
The previously posted "ready" file approach works really well. Another approach is to create the file under a restricted set of permissions and chmod when complete. This allows the "instantaneous" appearance of a complete file.
I see no alternative that does not involve changing the sending process, or the process that controls the sending process. Somehow you need to know it's done sending; otherwise, there could be buffered data in the pipeline you don't see yet. Sampling the file and resampling a few minutes later for no changes is an unreliable way of determining whether the file transfer is complete. Again, network congestion, processing bottlenecks, and buffering could be delaying the arrival of the data.
Yours is a classic problem, solved many times over with standard solutions. This is a lesson learned a long time ago, yet apparently someone upstream of you was skipping school.
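The rename-probe loop could be sketched like this. Filenames are examples, and note that rename-fails-while-open is a Windows behaviour - on Unix, mv usually succeeds even while a writer holds the file open, so there it only proves the file exists:

```shell
#!/bin/sh
# Sketch of the rename-probe trick. On Windows, a rename fails while the
# sender still holds the file open; here the upstream transfer is simulated
# with touch so the sketch can actually run.
SRC=incoming.dat             # name the sender writes to (example)
READY=WQASImport.dat         # name the DS job expects (example)
TRIES=0

touch "$SRC"                 # stand-in for the completed upstream transfer

until mv "$SRC" "$READY" 2>/dev/null; do
    TRIES=$((TRIES + 1))
    [ "$TRIES" -ge 60 ] && { echo "gave up waiting for $SRC" >&2; exit 1; }
    sleep 60                 # still being written (or not arrived); poll again
done
echo "renamed to $READY; safe to start the DS job"
```

The retry cap keeps the watcher from spinning forever if the file never arrives; a DOS batch version would use RENAME plus IF ERRORLEVEL in the same loop shape.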
Thanks for all the help, guys - it's invaluable.
Jim
Sure I need help... But who doesn't?