
Reading a file from Mainframe server

Posted: Thu Feb 22, 2007 11:09 am
by vnspn
Hi,

I'm using DS version 7.1.

I need to read a Mainframe file from a Mainframe machine. I'm currently using FTP stage to read the data from the remote machine and then process it.

For the metadata used in the FTP stage, I import the copybook layout. But this does not read the file correctly; I need to modify the copybook layout to some extent before importing it. Only then does the data get read correctly.

I would like to know what is the best method to read a file from the Mainframe machine. The Complex Flat File stage is available only in a Mainframe Job. Is there any better approach in a Server Job to read from a Mainframe machine than what I do here? It's because modifying the 1000-line copybook layouts is a tedious task.

Thanks.

Posted: Thu Feb 22, 2007 11:46 am
by ArndW
Complex Flat File stages are available in Server, PX and Mainframe jobs.

Accessing the data is done via the FTP stage, as you've already done, or via a standard FTP program (or perhaps your site has enabled other tools such as rcp to the host). These are good, efficient approaches.
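For what it's worth, the "standard FTP program" route can also be scripted outside DataStage. A minimal Python sketch using the standard ftplib (the host, credentials and dataset name are placeholders, and the quoting of the dataset name follows the usual mainframe FTP convention):

```python
from ftplib import FTP


def fetch_dataset(host, user, password, dataset, local_path):
    """Pull a mainframe dataset in binary mode so EBCDIC bytes arrive untouched."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        with open(local_path, "wb") as out:
            # Quoting the dataset name makes it fully qualified on most MVS FTP servers.
            ftp.retrbinary(f"RETR '{dataset}'", out.write)


def ebcdic_to_ascii(raw: bytes) -> str:
    """Decode EBCDIC (code page 037) bytes to text."""
    return raw.decode("cp037")


# Example: the EBCDIC bytes for "Hello"
print(ebcdic_to_ascii(b"\xc8\x85\x93\x93\x96"))  # prints Hello
```

Transferring in binary and decoding yourself avoids the FTP server's own EBCDIC-to-ASCII translation mangling packed or binary fields.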

If your copybooks need "massaging" then any approach you take is going to involve manual labor that I don't think you will be able to avoid.

Posted: Thu Feb 22, 2007 1:56 pm
by ray.wurlod
Arnd has covered the gamut of approaches.

I'd be asking "them" why the COBOL file definition does not match the COBOL file.
:roll:

Posted: Thu Feb 22, 2007 6:46 pm
by vmcburney
Isn't it usually the case that you have to make some minor changes to a COBOL copybook to get DataStage to recognise it? DataStage seems picky about levels, comments and syntax.

Posted: Fri Feb 23, 2007 10:10 am
by vnspn
The COBOL file definition matches the COBOL file, that's correct. But when I import that table definition, I get details like the level number in the metadata.

I believe these kinds of metadata with level numbers can be recognized only by the Complex Flat File stage, not by stages like FTP. That's the reason I'm making some minor changes to the table definition, so that all the columns that are needed are at the same level number. Only then does the FTP stage read the data correctly.

Does anybody have thoughts on this?
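The flattening described above effectively turns the nested copybook into one flat list of fields at a single level, each with a fixed offset and width. A minimal Python sketch of reading such a record (the field names and widths here are made up for illustration):

```python
# Hypothetical flattened layout: (field name, byte offset, byte length).
LAYOUT = [
    ("CUST-ID",   0,  6),
    ("CUST-NAME", 6, 10),
    ("BALANCE",  16,  8),
]


def parse_record(record: bytes) -> dict:
    """Slice one fixed-width record into named fields using the flat layout."""
    return {name: record[off:off + length].decode("ascii").strip()
            for name, off, length in LAYOUT}


rec = b"000123JOHN SMITH00004250"
print(parse_record(rec))
```

Once every needed column is at the same level, the record is just fixed-width slices like this, which is the form a generic stage can consume.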

Posted: Fri Feb 23, 2007 3:04 pm
by ray.wurlod
Metadata for table definitions is stored generically. You can prove this easily: open the table definition and select the Layout tab. You can then view the same table definition as an SQL table definition, an Orchestrate record schema or a COBOL file definition.

I guess level numbers might interfere with the way the FTP stage works; it's not something I've ever needed to do. I've always been constrained (by mainframe security/politics) to working with files pushed from the mainframe.

Posted: Mon Feb 26, 2007 3:50 pm
by vnspn
I import the COBOL layout, which is in .txt format. The resulting table definition has a 'Level Number' attribute for each column; this is how it differs from the other, generic table definitions. This causes the problem in reading the source file correctly.

Also, the COBOL layout file has some REDEFINES clauses that cause the source data to be read incorrectly.
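For readers unfamiliar with REDEFINES: it overlays two definitions on the same bytes, so a flat reader has to pick one interpretation, which is why these clauses have to be stripped out before import. A small Python sketch with a made-up fragment:

```python
# Hypothetical COBOL fragment:
#   05 CONTACT            PIC X(8).
#   05 CONTACT-DATE REDEFINES CONTACT.
#      10 CD-YEAR         PIC X(4).
#      10 CD-MMDD         PIC X(4).
# Both definitions describe the SAME 8 bytes; a flat layout can keep only one.
area = b"20070222"

as_text = area.decode("ascii")            # first definition: one X(8) field
as_date = (area[0:4].decode("ascii"),     # redefined view: two X(4) fields
           area[4:8].decode("ascii"))

print(as_text)   # 20070222
print(as_date)   # ('2007', '0222')
```

A REDEFINES adds no bytes to the record; it only offers an alternate reading of bytes already counted, so dropping one view keeps the record length unchanged.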

To overcome all this, some tweaks need to be made to the layout .txt file before importing it.

Is this a correct way of doing things?

Thanks.

Posted: Mon Feb 26, 2007 5:14 pm
by kumar_s
Even Server jobs have the CFF stage, as a plugin. :roll: