
How to handle DB2 (z/OS) unload data in a DS server job?

Posted: Tue Aug 03, 2004 4:33 am
by mangrick
Hi all,

I want to process data unloaded from DB2 (z/OS) tables in a UNIX environment using a DS server job.

The data files will be sent to the UNIX system via FTP.

Finally, the processed data should be written to Oracle load-ready files and loaded into Oracle tables (UNIX).

Questions:
How to deal with the EBCDIC encoding issue?

What is the ideal way / format to unload the tables?

How should those files be transferred via FTP: as text or binary?

What stages should be used to read the data: the Sequential File or the Complex Flat File (CFF) stage? I think the DB2 stages will not work here.

How to deal with the NULL indicators?

Would it perform better to access DB2 directly via a DB2 client?

Thanks for your answers.

Mathias

Posted: Tue Aug 03, 2004 5:18 am
by denzilsyb
hi Mathias
"How to deal with the EBCDIC encoding issue?"
We also receive DB2 data, and currently we use the CFF stage to read it. The data is downloaded in binary.
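To illustrate what "downloaded in binary" buys you: the bytes arrive still in EBCDIC, and something downstream (the CFF stage, or a script) has to translate them. A minimal sketch in Python, assuming the mainframe uses code page CP037 (US/Canada EBCDIC); the actual code page (CP273, CP500, CP1047, ...) depends on the DB2 subsystem's setup:

```python
# Decode a fixed-width EBCDIC field into a readable string.
# CP037 is assumed here; check the actual code page of your source system.
ebcdic_record = b"\xd4\x81\xa3\x88\x89\x81\xa2"  # "Mathias" in CP037

decoded = ebcdic_record.decode("cp037")
print(decoded)  # Mathias
```

If the file had been FTPed in text mode instead, the FTP server would have applied its own EBCDIC-to-ASCII translation, which corrupts packed-decimal and binary columns; that is why binary transfer plus explicit decoding of the character columns is the safer route.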

Try to avoid the temptation of appending CR/LF at the end of each line when FTPing the data to your server.
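Transferring in binary (image) mode is what prevents the FTP layer from inserting CR/LF or translating code pages in transit. A sketch using Python's ftplib; host, credentials, and paths are placeholders:

```python
from ftplib import FTP

def fetch_unload_file(host, user, password, remote_path, local_path):
    """Pull a DB2 unload file in binary (image) mode so no CR/LF
    insertion or code-page translation happens in transit.
    All connection details here are hypothetical placeholders."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        # retrbinary issues TYPE I (image mode) before RETR, so the
        # bytes land on the UNIX side exactly as they left the mainframe.
        with open(local_path, "wb") as f:
            ftp.retrbinary(f"RETR {remote_path}", f.write)
```

The equivalent at a command-line FTP prompt is simply issuing `binary` before `get`.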

I have not hit a NULL indicator yet, but I am sure the CFF stage will handle it. If I do, I will see what the CFF has done, and if that does not suffice, I will use the data type transforms that are installed with DataStage.

In our case we are given files to work from, but I think we would have more flexibility if we could access the data source directly. Performance-wise, you don't have to worry about retrieving the data if they send you the files; just make sure the data always conforms to a specific standard.