FTP a file in a Sequence Job

VCInDSX
Premium Member
Posts: 223
Joined: Fri Apr 13, 2007 10:02 am
Location: US

FTP a file in a Sequence Job

Post by VCInDSX »

Hi,
I (re)searched several threads on FTP and found a lot of horror stories. More often than not, the conclusion was that the best approach is to use command-line FTP and then get on with the actual job.
My requirements are as follows.
1. Server Job1 creates an XML file from a database table.
2. An "After-Job" step compresses this XML file into a ".ZIP" file. (Only if the Job was successful)
3. If the "Zip" file was created, then need to FTP this to another server.
4. Send an email notification to predefined list of users/support team.
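
For illustration only, here is a minimal sketch of the compression step, assuming the after-job step is an ExecSH call (or a small script it invokes); the file names and paths are placeholders, not the real ones:

# Compress the XML extract into a ZIP archive (placeholder paths).
# -j stores the file without its directory path inside the archive;
# a non-zero exit code signals failure back to the calling job.
XML_FILE=/data/out/orders.xml
ZIP_FILE=/data/out/orders.zip
zip -j "$ZIP_FILE" "$XML_FILE" || exit 1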

At this time:
I designed and tested Job1 and it works fine.
I created a Sequence Job to invoke Job1. Job1 takes the zip file name as a job parameter.
I have set up a notification activity and that works fine too. (Just runs Job1 and notifies users. No FTP yet :( )
I am not clear on how to plug the file name from Job1 into another job that would perform the FTP operation.
Right now, I am thinking of using the FTP stage in another job called "Job2" and having it do the FTP. I am trying to understand the FTP stage properties.... and am not clear on why one should provide all that column information. All I need to do is upload ONE file.

OR should I use the ExecuteCommand activity on the Sequence canvas, push the "Zip" file and be done with it? In this approach, how much error handling can I support?
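
For what it's worth, a minimal sketch of what such an ExecuteCommand step could run is below. The host, credentials and paths are placeholders, and the error handling is limited to checking the client's exit status and scanning its output, since the stock ftp client is not very forthcoming about failures:

# Upload a single ZIP file with the standard command-line ftp client.
# All values below are placeholders.
HOST=remote.example.com
USERID=ftpuser
PASSWD=secret
LOCAL_FILE=/data/out/orders.zip
REMOTE_DIR=/incoming

# -n suppresses auto-login so the session can be scripted;
# -v forces the server responses into the captured log.
LOG=$(ftp -n -v "$HOST" 2>&1 <<END_FTP
user $USERID $PASSWD
binary
cd $REMOTE_DIR
put $LOCAL_FILE
bye
END_FTP
)
RC=$?

echo "$LOG"
# The ftp client often exits 0 even when a transfer fails, so also scan
# the session log for common failure responses.
if [ $RC -ne 0 ] || echo "$LOG" | grep -Eq "^(4|5)[0-9][0-9] |Not connected|No such file"; then
    echo "FTP upload failed" >&2
    exit 1
fi
exit 0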

In either case, what are the mechanisms for passing the file name from Job1 to another job? Should I save such inter-job parameters in a global space?
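
One simple mechanism, sketched below purely as an illustration, is to have an after-job step in Job1 write the generated file name to a small handshake file in an agreed location, so the downstream FTP command can pick it up from there. The path names are placeholders:

# After-job side (Job1): record the name of the ZIP file just produced.
echo "/data/out/orders.zip" > /data/handoff/latest_zip.txt

# FTP side (ExecuteCommand activity or Job2): read the name back.
LOCAL_FILE=`cat /data/handoff/latest_zip.txt`
[ -s "$LOCAL_FILE" ] || { echo "ZIP file missing or empty" >&2; exit 1; }

Since the sequence already hands Job1 the zip file name as a job parameter, another option is simply to define that name as a parameter of the sequence itself and feed the same value to whichever activity performs the FTP.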

Apologies for loading this post.... I can split it into multiple posts, if required.

Thanks in advance,
-V
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

The FTP stage in DataStage has the advantage that it handles the data on a line-by-line basis, whereas the command-line FTP client just moves the file as a whole. With the DS FTP stage, a long-running transfer can be processed row by row as the data comes in or goes out, instead of having to wait for the transfer to complete before processing.
This is analogous to using a named pipe as opposed to a sequential file.

In your case the file is transferred as a ZIP file, so it makes more sense to use standard command-line FTP. If you do wish to use DataStage, then you just need to ensure that the stage defines a binary transfer; the column metadata is irrelevant, as the data isn't structured. So use a single column with some arbitrary length (4096?) and make sure you write out with no column or line delimiters, so the format isn't modified by DS.
VCInDSX
Premium Member
Posts: 223
Joined: Fri Apr 13, 2007 10:02 am
Location: US

Post by VCInDSX »

Hi ArndW,
Thanks for your time and response. Appreciate it much.
To keep it simple, I wanted to create a small job that uses an existing file on the server for its upload operation.
Here is what i have in mind.
SeqFileStage (Configure Zip file name) ==> FTP Stage (FTP folder et al).

Is this the right approach? I wonder if all the properties of the Sequential File stage can be satisfied when I work with a "Zip" file.... as I wouldn't have any column information....

Well, looks like scripting is the way to go.....

Thanks,
-V