
Calling a DataStage job from an AS400 command line

Posted: Tue Feb 03, 2004 3:52 am
by virginie
Hello Everybody,
I have developed a DataStage application that retrieves files from AS/400 systems all over the world (it is scheduled to run at 6 every morning). Now the users are asking to be able to send their files themselves when they are ready. I don't want to give them access to DataStage Director.
Is there a way to call a DataStage job directly from an AS/400 command line?
Thanks
Isabelle

Posted: Tue Feb 03, 2004 4:29 am
by roy
Hi,
Any rsh or FTP remote command can be used to run the dsjob command-line interface, or any local script that starts the job locally.
They will need to get the job status as well, so you'll have to program for that too. One way to do it is to run the dsjob command-line utility, redirect its output to a file, FTP that file to the AS/400, and parse it there.
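A minimal sketch of that status round-trip, assuming a typical "Job Status : ..." line in the dsjob output (the project and job names are made up, and the actual dsjob calls are left commented out since they only work on a DataStage server):

```shell
#!/bin/sh
# Run a job via the dsjob CLI, capture its output to a file, then parse
# the status out of it. The output format below is an assumption based on
# typical "dsjob -jobinfo" output, not verbatim.

PROJECT=MyProject          # hypothetical project name
JOB=LoadAS400Files         # hypothetical job name
OUT=/tmp/dsjob_status.txt

# On the DataStage server you would run something like:
# dsjob -run -wait "$PROJECT" "$JOB"
# dsjob -jobinfo "$PROJECT" "$JOB" > "$OUT"

# For this sketch, simulate the captured output:
cat > "$OUT" <<'EOF'
Job Status      : RUN OK (1)
Job Controller  : not available
EOF

# Parse the status line; the AS/400 side could FTP $OUT back and do the same.
STATUS=$(grep 'Job Status' "$OUT" | sed 's/.*: *//')
echo "$STATUS"
```

The same parsing can run on the AS/400 after the file is FTPed across, so the remote user sees whether their run succeeded.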

IHTH,

Posted: Tue Feb 03, 2004 7:01 am
by kcbland
Do you really want to design a system where users run jobs? How do you manage unavailability? How do you manage resources and load on the machine? What about conflicting jobstreams: how are you going to prevent two different job streams from executing simultaneously when that's not allowed? This is probably not the best design.

You're better off with your standard schedule to fetch and load data. Then, architect a solution whereby you publish data files of information and the remote users can simply fetch the results from a publish/staging area via FTP. Not only is this more elegant, but it allows you to schedule and coordinate all activities, as well as produce the files consistently.

Posted: Tue Feb 03, 2004 8:17 am
by virginie
Thanks to both of you for your answers.
Actually I don't have a choice: I have to enable the users to send their own data when they ask for it. Otherwise I would have to check every hour whether the relevant files have been updated and, if so, retrieve them to France (which is no better a solution, in my opinion).
Nevertheless, I fully agree with you that users shouldn't run jobs. I'll try to send an FTP remote command, each one calling a different instance of the same job if possible. The files to extract and load are different for each AS/400, so I suppose there won't be any conflicts.
Thanks again
Isabelle

Posted: Tue Feb 03, 2004 8:46 am
by kcbland
Why don't you use a job control framework whereby users pass requests via some mechanism (an intranet, whatever)? You could have a polling process watch for requests (request configuration files landing in a landing zone, containing information such as the jobstream name and custom parameter values) and then manage and coordinate the running of job streams. At least then you have the power, from the DataStage side, to control which jobstreams are running, as well as deal with conflicting streams. And you don't have to deal with remote execution (permissions are always a nightmare when doing remsh or rsh between servers, especially between Unix and NT).
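One possible shape for that polling process, assuming a simple key=value request-file format (the file name, keys, and landing-zone handling are all invented for illustration):

```shell
#!/bin/sh
# Scan a landing zone for request files, read the jobstream name and
# parameter values from each, then hand the request off to whatever
# starts the jobstream.

LANDING=$(mktemp -d)       # stands in for your real landing-zone path

# Simulate a request file arriving from a remote user:
cat > "$LANDING/req_paris.cfg" <<'EOF'
jobstream=ExtractAS400
param.SOURCE=PARIS
param.RUNDATE=2004-02-03
EOF

for req in "$LANDING"/*.cfg; do
    [ -f "$req" ] || continue
    stream=$(grep '^jobstream=' "$req" | cut -d= -f2)
    echo "request for jobstream: $stream"
    # Here you would check for conflicting streams, then e.g.:
    # dsjob -run -param SOURCE=... "$PROJECT" "$stream"
    mv "$req" "$req.processed"    # so the same request is not run twice
done
```

Renaming the file after it is picked up is one simple way to keep the poller idempotent; the check for conflicting streams is where the coordination Ken describes would live.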

Watch process

Posted: Tue Feb 03, 2004 10:39 am
by virginie
Hi Kenneth,

A watch process was actually my first idea. I'm just afraid that a systematic check for files every hour or half hour might be too heavy on resources for our DataStage server, especially for files that are only supposed to be sent during the first five days of each month.

Isabelle

Posted: Tue Feb 03, 2004 1:23 pm
by roy
Hi,
here is another idea you could consider:
give them the ability to make requests remotely, simply by letting them run something on your machine; for example, a script whose arguments contain predetermined request details.
This script could insert a row into a request table in your database.
If you put an insert trigger on that table, you have your automatic request checker with little to no overhead for request sampling.
The rest, as far as I can see from the thread, you can manage on your own.
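A sketch of the script the remote users would be allowed to run, turning their arguments into a row for a request table. The table name, columns, and site list are made up for illustration; in real use the INSERT would be piped to your database client, and the insert trigger on the table would kick off the jobstream:

```shell
#!/bin/sh
# Validate predetermined request details and emit the INSERT statement
# for a hypothetical ds_requests table.

make_request() {
    source=$1     # requesting AS/400 site
    fileset=$2    # which predetermined file set to send
    case "$source" in
        PARIS|TOKYO|CHICAGO) ;;                  # predetermined sites only
        *) echo "unknown source: $source" >&2; return 1 ;;
    esac
    printf "INSERT INTO ds_requests (source, fileset) VALUES ('%s', '%s');\n" \
        "$source" "$fileset"
}

make_request PARIS monthly_stock
```

Restricting the arguments to a fixed list is what keeps this safe to expose remotely: users can only request what you have predetermined, never run arbitrary commands.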

IHTH,

Posted: Tue Feb 03, 2004 3:27 pm
by ray.wurlod
Given your constraint of having to let them "send their files", I would limit them to just that. Provide an FTP utility to allow them to send their own files to the DataStage machine in France, and have DataStage check periodically for expected and unexpected arrivals of files in one or more standard locations.
This covers you in the event of unavailability; it's essentially "store and forward" - the FTP utility will fail in the event of either machine being unavailable, but the source file will remain on the source system until everything becomes available again.
I have worked recently on two systems both of which use this approach successfully.
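The periodic check described above could be sketched like this, assuming a single standard landing directory and a known list of expected file names (both are assumptions for illustration):

```shell
#!/bin/sh
# Scan a standard landing directory and sort arrivals into expected
# (hand off to the job) and unexpected (flag for review).

LANDING=$(mktemp -d)       # stands in for the real FTP landing directory

# Files we expect remote sites to send (assumed names):
EXPECTED="paris_stock.dat tokyo_stock.dat"

# Simulate two arrivals, one expected and one not:
touch "$LANDING/paris_stock.dat" "$LANDING/mystery.dat"

for f in "$LANDING"/*; do
    name=$(basename "$f")
    case " $EXPECTED " in
        *" $name "*) echo "expected arrival: $name" ;;   # hand to the job
        *)           echo "unexpected arrival: $name" ;; # flag for review
    esac
done
```

Because the FTP transfer either completes or leaves the file on the source system, a scan like this only ever sees complete deliveries, which is what makes the store-and-forward approach robust to either machine being unavailable.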