I'm wondering if anyone has used DataStage to read an IMS unload file from the mainframe. I'd like to know whether it's possible, and any tips would be welcome.
In the past (at other shops), the mainframe folks would FTP files to the DataStage server in a ready-to-use format (ASCII delimited text files). Here, we are considering whether we can process the IMS unload files directly, without having to write COBOL programs.
I don't have my hands on the file yet to know the layout. I can only assume it will be EBCDIC and contain packed decimal fields.
If we FTP the file from the mainframe to AIX, will an ASCII FTP automatically convert it from EBCDIC to ASCII? Or is it better to do a binary FTP and then use DataStage to do the conversion?
Do people usually use the Complex Flat File stage for cases like this? I've not used that stage before. I'm not sure whether one file will contain multiple record types, but my understanding of IMS is that it's hierarchical rather than relational, with parent and child segments we'll have to match up.
Appreciate any tips... I searched on IMS and IMS unload but found no topics (except for matches on "ims" like "claims"). Thanks!
general questions on processing IMS unload files
Moderators: chulett, rschirm, roy
Choose a job you love, and you will never have to work a day in your life. - Confucius
I can't speak for accessing IMS files in particular, but I would caution against an ASCII FTP if the files are EBCDIC and contain ANYTHING other than alphanumeric data. ANY non-alphanumeric data is subject to corruption when ASCII FTP'd from an EBCDIC system...FTP has no concept of the record layout and will attempt to convert every byte from EBCDIC to ASCII.
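To see why that matters: a packed-decimal (COMP-3) field stores two BCD digits per byte with a sign nibble at the end, so a byte-by-byte EBCDIC-to-ASCII translation destroys it. A minimal sketch of decoding one such field after a *binary* transfer (the function name and scale handling are my own illustration, not a DataStage API):

```python
def unpack_comp3(data: bytes, scale: int = 0):
    """Decode an EBCDIC packed-decimal (COMP-3) field.

    Each byte holds two BCD digits; the low nibble of the final byte
    is the sign (0xC or 0xF = positive, 0xD = negative).
    """
    digits = []
    for byte in data[:-1]:
        digits.append((byte >> 4) & 0x0F)
        digits.append(byte & 0x0F)
    digits.append((data[-1] >> 4) & 0x0F)   # last byte: one digit + sign nibble
    sign_nibble = data[-1] & 0x0F

    value = int("".join(str(d) for d in digits))
    if sign_nibble == 0x0D:
        value = -value
    return value / (10 ** scale) if scale else value

print(unpack_comp3(b"\x12\x34\x5C"))      # bytes 12 34 5C -> 12345
print(unpack_comp3(b"\x00\x12\x3D", 2))   # bytes 00 12 3D, 2 decimals -> -1.23
```

An ASCII FTP would have rewritten those raw nibble values as if they were EBCDIC characters, leaving nothing decodable.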
Regards,
- james wiles
All generalizations are false, including this one - Mark Twain.
Host-based data transfer utilities have built-in translation from EBCDIC to ASCII, and my experience with DataStage has been consistent with that. I also have no direct experience with CFF, but I use FTP Enterprise extensively with no problems. What you end up using is dependent on your shop's connectivity capabilities and security requirements.
In my case, I rarely land files on the Unix server from the host. FTP Enterprise is the first stage and data flows very well through my jobs.
Your basic requirements: use binary mode, record type "implicit", "allow all zeroes" set to yes, and a consistent record format in the IMS files you source. COBOL FD import in DataStage also works very well, with just a little bit of hand-holding.
Franklin Evans
"Shared pain is lessened, shared joy increased. Thus do we refute entropy." -- Spider Robinson
Using mainframe data FAQ: viewtopic.php?t=143596 Using CFF FAQ: viewtopic.php?t=157872
"Shared pain is lessened, shared joy increased. Thus do we refute entropy." -- Spider Robinson
Using mainframe data FAQ: viewtopic.php?t=143596 Using CFF FAQ: viewtopic.php?t=157872
Old post, but I'm tempted to leave a reply. We have DataStage Mainframe Edition; I can create a job to read an IMS file and convert it to a fixed-width or delimited flat file.
First you import the DBD (database descriptor) and PSB (program specification block) using the IMS definition import.
You get a schema view when you apply it to the IMS stage canvas; from there you can select the hierarchy path, flatten the arrays, and write to an output link (which can go through a Mainframe transformer) and then to a fixed-width flat file or any other target (FTP). These files can then be processed by a conventional parallel or server job. I'm somewhat new to mainframe jobs, and we only have 25 existing DataStage mainframe jobs running on z/OS. IMS use has been discontinued in my current organization (mostly DB2 on z/OS now), but some associates still use it.
HTH. Mainframe job licensing in DataStage is a separate bill, but my current client has it.
You may also consider the Multi-Format Flat File stage if you need to read an IMS download file whose structure the MFF stage supports (using the Complex file load option).