Calling a DataStage job from an AS400 command line

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

virginie
Participant
Posts: 7
Joined: Mon Nov 24, 2003 11:12 am

Calling a DataStage job from an AS400 command line

Post by virginie »

Hello Everybody,
I have developed a datastage application that retrieves files from AS400 all over the world (which is scheduled at 6 every morning). Now the users are asking me to be able to send themselves their files when they are ready. I don't want to give them a hand on DataStage Director.
Is there a way to call a DataSatge job directly from an AS400 command line ?
Thanks
Isabelle
roy
Participant
Posts: 2598
Joined: Wed Jul 30, 2003 2:05 am
Location: Israel

Post by roy »

Hi,
Any rsh or FTP remote command can be used to run the dsjob command line interface, or any local script that starts the job locally.
They will need to get the job status as well, so you'll have to program that too. One way to do it is to use the dsjob command line utility, redirect its output to a file, FTP that file to the AS/400, and parse it there.
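A minimal sketch of such a wrapper on the DataStage server (project, job, and file names are placeholders; check the dsjob options against your release's documentation):

```shell
#!/bin/sh
# Sketch of a wrapper script that a remote rsh/FTP command would invoke
# on the DataStage server. All names here are placeholders.

run_job_and_report() {
    project=$1; job=$2; report=$3
    # -wait blocks until the job finishes
    dsjob -run -wait "$project" "$job" || return 1
    # -jobinfo prints lines such as "Job Status : RUN OK (1)"
    dsjob -jobinfo "$project" "$job" > "$report"
}

# Pull just the status line out of the report; the AS/400 side would
# FTP the report file back and do the equivalent parse there.
parse_status() {
    grep 'Job Status' "$1"
}

# Example (commented out; needs a real DataStage install):
# run_job_and_report MyProject LoadFiles /tmp/loadfiles.status
# parse_status /tmp/loadfiles.status
```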

IHTH,
Roy R.
Time is money but when you don't have money time is all you can afford.

Search before posting:)

Join the DataStagers team effort at:
http://www.worldcommunitygrid.org
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL
Contact:

Post by kcbland »

Do you really want to design a system where users run jobs? How do you manage unavailability? How do you manage resources and load on the machine? What about conflicting jobstreams: how are you going to prevent two different job streams from executing simultaneously when that is not allowed? This is probably not the best design.

You're better off with your standard schedule to fetch and load data. Then, architect a solution whereby you publish data files of information and the remote users can simply fetch the results from a publish/staging area via FTP. Not only is this more elegant, but it allows you to schedule and coordinate all activities, as well as produce the files consistently.
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
virginie
Participant
Posts: 7
Joined: Mon Nov 24, 2003 11:12 am

Post by virginie »

Thanks to both of you for your answers;
Actually I don't have a choice: I have to enable the users to send their own data when they ask to. Otherwise I would have to check every hour whether the files concerned have been updated and, if so, retrieve them to France (which is no better a solution, in my opinion).
Nevertheless, I agree completely that users shouldn't run jobs. I'll try to send an FTP remote command, each calling a different instance of the same job if possible. The files to extract and load are different for each AS/400, so I suppose there won't be any conflict.
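If the job is compiled as multi-instance, dsjob can (as far as I recall) address each instance as jobname.invocationid, so each AS/400 could trigger its own invocation; a hypothetical sketch, with made-up site, project, and parameter names:

```shell
#!/bin/sh
# Hypothetical: one invocation of a multi-instance job per AS/400.
# dsjob addresses an instance as jobname.invocationid (verify against
# your release's dsjob documentation); all names below are made up.

start_instance() {
    site=$1   # e.g. FRANCE, TOKYO, ...
    # Each site's remote command triggers its own invocation, so runs
    # for different AS/400s cannot collide with each other.
    dsjob -run -param SourceSite="$site" MyProject ExtractFiles."$site"
}

# start_instance TOKYO
```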
Thanks again
Isabelle
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL
Contact:

Post by kcbland »

Why don't you use a job control framework whereby users pass requests via some mechanism (intranet, whatever)? You could have a polling process watch for requests (request configuration files land in a landing zone, containing information such as the jobstream name and custom parameter values) and then manage and coordinate the running of job streams. At least you then have the power, from the DataStage side, to control which jobstreams are running, as well as deal with conflicting streams. And you don't have to deal with remote executions (permissions are always a nightmare when doing remsh or rsh between servers, especially between Unix and NT).
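One way to sketch that polling pass, assuming request files land as one-line *.req files (jobstream name first, then parameter assignments) in a landing zone; the layout and every name here are hypothetical:

```shell
#!/bin/sh
# Hypothetical polling pass over a landing zone. A scheduler (cron or a
# DataStage sequence) would call this every few minutes; the .req file
# layout and all names are made up for illustration.

process_requests() {
    zone=$1
    for req in "$zone"/*.req; do
        [ -e "$req" ] || continue          # glob matched nothing: no requests
        read -r jobstream params < "$req"
        # Real version: dsjob -run -param ... MyProject "$jobstream",
        # after checking that no conflicting stream is already running.
        echo "run $jobstream with $params"
        mv "$req" "$req.done"              # keep the request as an audit trail
    done
}
```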
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
virginie
Participant
Posts: 7
Joined: Mon Nov 24, 2003 11:12 am

Watch process

Post by virginie »

Hi Kenneth,

A watch process was actually my first idea. I'm only afraid that systematically watching for files every hour or half hour might be too heavy on resources for our DataStage server, especially for files which are supposed to be sent only during the first five days of each month.

Isabelle
roy
Participant
Posts: 2598
Joined: Wed Jul 30, 2003 2:05 am
Location: Israel

Post by roy »

Hi,
here is another idea you could consider:
You could give them the ability to make requests remotely simply by letting them run something on your machine, for example a script whose arguments contain the predetermined request details.
This script could insert a row into a request table in your database.
If you put an insert trigger on that table, you have your automatic request checker with little to no overhead for polling requests.
The rest, as far as I can see from the post, you can manage on your own.
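A sketch of the user-facing request script; "sql_client" stands in for whatever command-line SQL interface your database provides, and the job_requests table is made up:

```shell
#!/bin/sh
# Hypothetical request script the remote users would run via an rsh/FTP
# remote command. "sql_client" and the job_requests table are invented;
# substitute your database's CLI and schema.

build_request_sql() {
    jobstream=$1; site=$2
    echo "INSERT INTO job_requests (jobstream, site, requested_at)" \
         "VALUES ('$jobstream', '$site', CURRENT_TIMESTAMP);"
}

# Real use would pipe the statement into the client, e.g.:
# build_request_sql DailyLoad TOKYO | sql_client
```

An insert trigger on job_requests (or a poll of the table) then drives the actual dsjob run, so nothing on the DataStage side burns cycles waiting.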

IHTH,
Roy R.
Time is money but when you don't have money time is all you can afford.

Search before posting:)

Join the DataStagers team effort at:
http://www.worldcommunitygrid.org
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Given your constraint of having to let them "send their files", I would limit them to just that. Provide an FTP utility to allow them to send their own files to the DataStage machine in France, and have DataStage check periodically for expected and unexpected arrivals of files in one or more standard locations.
This covers you in the event of unavailability; it's essentially "store and forward" - the FTP utility will fail in the event of either machine being unavailable, but the source file will remain on the source system until everything becomes available again.
I have worked recently on two systems both of which use this approach successfully.
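The periodic check can guard against half-transferred uploads by only reporting files whose size is stable across two samples; a sketch, with a placeholder directory name:

```shell
#!/bin/sh
# Hypothetical arrival check: report only files whose size is unchanged
# across two samples, so a half-finished FTP upload is not picked up
# early. The directory name is a placeholder.

check_arrivals() {
    incoming=$1
    for f in "$incoming"/*; do
        [ -f "$f" ] || continue
        size1=$(wc -c < "$f")
        sleep 1
        size2=$(wc -c < "$f")
        [ "$size1" -eq "$size2" ] && echo "$f"
    done
}

# A scheduled job would call check_arrivals /data/incoming and start the
# appropriate DataStage job for each file reported.
```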
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.