
Interesting issue

Posted: Fri Nov 07, 2003 7:43 am
by JDionne
I have a job that looks for a file before it starts. The problem is that if the file is still being written when the job starts looking, the job aborts. I know there is an easy way to have a DOS batch file test whether the file has finished being written and then call a command to start a job. I do this all the time with SQL Server and DTS jobs, but we are migrating off of SQL Server and I need similar functionality in DS. I don't mind having a batch file sniff for the file and then kick off a DS job; the question I have is: what would the code to start the DS job look like?
Jim

Posted: Fri Nov 07, 2003 7:59 am
by chulett
Server Job Developer's Guide - look for the "Command Line Interface" section, typically Chapter 14, I believe. It spells out the syntax and usage of the "dsjob" command, which is what you'll need in your batch file to launch a DataStage job.

Posted: Fri Nov 07, 2003 8:02 am
by JDionne
chulett wrote:Server Job Developer's Guide - look for the "Command Line Interface" section, typically Chapter 14, I believe. It spells out the syntax and usage of the "dsjob" command, which is what you'll need in your batch file to launch a DataStage job.
Thanks, that's what I needed.
Jim

Posted: Fri Nov 07, 2003 8:02 am
by kcbland
The command line interface for running jobs is "dsjob". This executable is heavily discussed on this forum; search for it and you'll get a zillion hits. Think of it as a command line Director: you can start, stop, and reset jobs. You can pull out job log information and all kinds of goodies.

This program is probably the best way to have your jobs initiated by a third-party piece of software, such as an enterprise scheduler.

Posted: Fri Nov 07, 2003 8:14 am
by JDionne
kcbland wrote:The command line interface for running jobs is "dsjob". This executable is heavily discussed on this forum; search for it and you'll get a zillion hits. Think of it as a command line Director: you can start, stop, and reset jobs. You can pull out job log information and all kinds of goodies.

This program is probably the best way to have your jobs initiated by a third-party piece of software, such as an enterprise scheduler.
From what I read in the online manual, it appears that I will need two lines of code, one to log on and one to run the job, i.e.

dsjob -server Scrbbususcnc04 -user condji -password ********
dsjob -run Development\WQASImport

There don't seem to be any good examples, though I have only browsed it so far. Am I correct in my thinking?
Jim

Posted: Fri Nov 07, 2003 8:23 am
by chulett
Nope, one "line" does everything - logon, run, reset - whatever. Just string all the keywords together.

Posted: Fri Nov 07, 2003 8:35 am
by JDionne
So

dsjob -server Scrbbususcnc04 -user condji -password ******** -run Development\WQASImport

is the line of code that I need to use.

Thanks for the help.
Jim
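For anyone following along, that one-liner can be wrapped in a small script that also checks the exit status. This is only a sketch, assuming a POSIX shell and that dsjob is on the PATH; the DSJOB variable is an indirection invented here for illustration (so the call can be stubbed out), not part of the real tool, and the argument form is copied verbatim from the post above - see the dsjob documentation for the full option list.

```shell
#!/bin/sh
# Sketch: wrap the dsjob one-liner and check its exit status.
# DSJOB is an illustrative indirection; in real use it is just "dsjob".
DSJOB="${DSJOB:-dsjob}"

run_ds_job() {
    server="$1"; user="$2"; password="$3"; job="$4"
    # Argument form copied from the thread; consult the dsjob docs for options.
    if "$DSJOB" -server "$server" -user "$user" -password "$password" -run "$job"; then
        printf 'started %s\n' "$job"
    else
        printf 'dsjob failed for %s\n' "$job" >&2
        return 1
    fi
}
```

A scheduler or batch wrapper can then branch on run_ds_job's return code instead of assuming the job launched.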

Posted: Fri Nov 07, 2003 8:38 am
by kcbland
Check out the post I just put up: Shell script to start a job using dsjob

Posted: Fri Nov 07, 2003 8:43 am
by JDionne
kcbland wrote:Check out the post I just put up: Shell script to start a job using dsjob
I did, and it looks very in-depth, but it's way over my head right now. I have it bookmarked so that one day I'll be able to get back to it and understand what it is doing in its totality.
Jim

Posted: Fri Nov 07, 2003 10:48 am
by Rahul
Why do all this? Here's a simple trick.

Have your source process create a control (indicator) file as soon as the data is completely written to the data file. Have your process keep looking for the control/indicator file; once it appears, the data has been completely written to the source data file and is ready to be processed.

For example, if data is being written to the file "Sourcedata.dat", create another file "SourceCntlFile" containing just one line. Your process then looks for "SourceCntlFile", since it is created only after Sourcedata.dat is complete.

This should fix your problem of starting your process while the source file is still being created.


Rahul
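The polling side of this approach can be sketched in a few lines of POSIX shell (the file names follow the example above; the polling interval is arbitrary):

```shell
#!/bin/sh
# Poll for the control/indicator file; its appearance means the source
# data file (e.g. Sourcedata.dat) has been completely written.
wait_for_indicator() {
    indicator="$1"
    interval="${2:-30}"          # seconds between checks
    while [ ! -f "$indicator" ]; do
        sleep "$interval"
    done
    printf 'ready: %s exists\n' "$indicator"
}
```

The loading process would call wait_for_indicator SourceCntlFile and only then read Sourcedata.dat.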

Posted: Fri Nov 07, 2003 10:53 am
by JDionne
Rahul wrote:Why do all this? Here's a simple trick.

Have your source process create a control (indicator) file as soon as the data is completely written to the data file. Have your process keep looking for the control/indicator file; once it appears, the data has been completely written to the source data file and is ready to be processed.

For example, if data is being written to the file "Sourcedata.dat", create another file "SourceCntlFile" containing just one line. Your process then looks for "SourceCntlFile", since it is created only after Sourcedata.dat is complete.

This should fix your problem of starting your process while the source file is still being created.


Rahul
I don't control the source process, so I cannot create the trigger file.
Jim

Posted: Fri Nov 07, 2003 11:17 am
by kcbland
There's no way to know if an inbound file is complete, no matter what type of sender, without the sender acknowledging the file transfer is complete.

The previously posted "ready" file approach works really well. Another approach is to create the file under a restricted set of permissions and chmod when complete. This allows the "instantaneous" appearance of a complete file.

I see no alternative that does not involve changing the sending process, or the process that controls the sending process. Somehow you need to know it's done sending, otherwise, there could be buffered data in the pipeline you don't see yet. Sampling the file and resampling a few minutes later for no changes is an unreliable way of determining if the file transfer is complete. Again, network congestion, processing bottlenecks, and buffering could be delaying the arrival of the data.

Yours is a classic problem, solved many times over with standard solutions. This is a lesson learned long ago, yet apparently someone upstream of you was skipping school. :(
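The permissions variant could look like this on the sending side (a sketch only, assuming a Unix sender; the file name and data are illustrative, and the reader would poll for read permission rather than mere existence):

```shell
#!/bin/sh
# Sketch of the chmod approach: create the file owner-only while writing,
# then chmod it readable when complete, so a finished file "appears"
# instantaneously to any reader polling for read permission.
publish_file() {
    target="$1"
    umask 077                          # created owner-only while being written
    printf 'some data\n' > "$target"   # stands in for the long-running write
    chmod 644 "$target"                # done: make it visible to the loader
}
```
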

Posted: Fri Nov 07, 2003 11:25 am
by JDionne
kcbland wrote:There's no way to know if an inbound file is complete, no matter what type of sender, without the sender acknowledging the file transfer is complete.

The previously posted "ready" file approach works really well. Another approach is to create the file under a restricted set of permissions and chmod when complete. This allows the "instantaneous" appearance of a complete file.

I see no alternative that does not involve changing the sending process, or the process that controls the sending process. Somehow you need to know it's done sending, otherwise, there could be buffered data in the pipeline you don't see yet. Sampling the file and resampling a few minutes later for no changes is an unreliable way of determining if the file transfer is complete. Again, network congestion, processing bottlenecks, and buffering could be delaying the arrival of the data.

Yours is a classic problem, solved many times over with standard solutions. This is a lesson learned long ago, yet apparently someone upstream of you was skipping school. :(
To remedy my upstream friends' omission, I'll just write a simple batch file that tries to rename the file; if it can't, it goes into a loop/sleep state. Once it can rename the file, it renames it again to the name the DS job needs and then kicks off the DS job. It's simple and archaic, but well suited to my needs. It seems to be the best way to do what I have to do.
Thanks for all the help, guys; it's invaluable.
Jim
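Jim's rename trick, sketched in shell for illustration. Note the platform caveat: on Windows the rename fails while the writer holds the file open, which is what makes the test work; on most Unix filesystems a rename succeeds even for an open file, so this check is only reliable where the OS locks open files.

```shell
#!/bin/sh
# Try to rename the incoming file; loop/sleep until the rename succeeds.
# On platforms that lock open files, success means the writer is done,
# and the DS job can then be kicked off against the renamed copy.
wait_and_rename() {
    src="$1"; dest="$2"
    interval="${3:-30}"          # seconds to sleep between attempts
    until mv "$src" "$dest" 2>/dev/null; do
        sleep "$interval"
    done
    printf 'renamed %s to %s\n' "$src" "$dest"
    # ...then launch the DS job (dsjob -server ... -run ...) against $dest
}
```
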

Posted: Fri Nov 07, 2003 1:58 pm
by kduke
The way Control-M and other similar products work is that they check the file size, wait, and check it again. If it quits growing, then it is complete.

Kim.
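That size-stability check can be sketched as a shell loop (a heuristic only, as kcbland notes above: buffering or a network stall can make a still-incoming file look stable; the interval is arbitrary):

```shell
#!/bin/sh
# Sample the file size, wait, and sample again; when it stops growing,
# treat the transfer as complete.
wait_for_stable_size() {
    file="$1"
    interval="${2:-60}"          # seconds between size samples
    prev=-1
    size=$(wc -c < "$file" | tr -d ' ')
    while [ "$size" -ne "$prev" ]; do
        prev=$size
        sleep "$interval"
        size=$(wc -c < "$file" | tr -d ' ')
    done
    printf 'size stable at %s bytes\n' "$size"
}
```
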

Posted: Fri Nov 07, 2003 2:03 pm
by JDionne
kduke wrote:The way Control-M and other similar products work is that they check the file size, wait, and check it again. If it quits growing, then it is complete.

Kim.
Control-M is Unix based, isn't it?
Jim