This is not going to help your current situation, but I quit using the CFF stage a long time ago. I never could get it to work completely for all possible combinations of packed fields and low/high values on the mainframe, which seems to be the situation you're in.
What I currently do is to generate ASCII files on the mainframe (outside of DataStage), unpacking all packed fields and stripping low- and high-values. One of our mainframe guys wrote a pretty general routine to do that on the mainframe side.
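For anyone taking the same route, the packed-field unpacking doesn't have to live on the mainframe. A minimal sketch of decoding a COMP-3 (packed decimal) field in Python; the field bytes and scale here are illustrative examples, the real layout comes from your copybook:

```python
def unpack_comp3(data: bytes, scale: int = 0) -> str:
    """Unpack an IBM COMP-3 (packed decimal) field into a decimal string.

    Each byte holds two BCD digits, except the last byte, whose low
    nibble is the sign (0xD = negative, 0xC/0xF = positive).
    """
    digits = []
    for byte in data[:-1]:
        digits.append(str(byte >> 4))
        digits.append(str(byte & 0x0F))
    last = data[-1]
    digits.append(str(last >> 4))
    sign = "-" if (last & 0x0F) == 0x0D else ""
    number = "".join(digits).lstrip("0") or "0"
    if scale:                         # insert the implied decimal point
        number = number.rjust(scale + 1, "0")
        number = number[:-scale] + "." + number[-scale:]
    return sign + number
```

For example, the bytes `0x12 0x34 0x5C` decode as 12345 (or 123.45 with an implied scale of 2).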
Ogmios
Reading Ebcdic file Using CFF - Urgent!!!!!
Re: Reading Ebcdic file Using CFF - Urgent!!!!!
In theory there's no difference between theory and practice. In practice there is.
Hello Ravi,
Were you showing us just some examples of the errors, or all of the errors you found? I ask because it is unlikely for the conversions to fail on some columns while working on others.
The Cobol type conversions will work - I know because I wrote them (a couple of years back).
Are you pulling the file over as ASCII or EBCDIC? If you are letting DataStage convert the file and THEN trying to do EBCDIC conversions, it will cause the types of errors you are alluding to. How are you getting your source data file to the UNIX server box? If you are using FTP, then you must use "BINARY" mode; otherwise FTP will do that character set conversion for you.
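For illustration, a command-line FTP session that pulls the dataset without any character-set translation might look like the sketch below; the host name, credentials, and dataset name are placeholders, not anything from this thread:

```shell
# Pull a mainframe dataset in BINARY mode so FTP does not
# translate EBCDIC to ASCII (which would destroy packed fields).
ftp -n mvs.example.com <<'EOF'
user MYUSER MYPASS
binary
get 'PROD.SOURCE.FILE' /data/source.ebc
bye
EOF
```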
If your COMP-3 fields are working while your X and S9 fields aren't, it does sound like you are doing an ASCII conversion prior to working on the fields.
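The effect is easy to demonstrate. A small sketch (using Python's cp037 codec as a stand-in for whatever EBCDIC code page your shop uses) shows that a whole-record character-set translation leaves a PIC X field readable while mangling the COMP-3 bytes:

```python
# A record with a PIC X(3) field "ABC" in EBCDIC, followed by
# a COMP-3 field holding 12345 (bytes 0x12 0x34 0x5C).
record = "ABC".encode("cp037") + b"\x12\x34\x5C"

# What an ASCII-mode transfer effectively does: translate the
# whole record from EBCDIC to ASCII, byte by byte.
translated = record.decode("cp037").encode("ascii", errors="replace")

print(translated[:3])   # the display field survives the translation
print(translated[3:])   # the packed bytes no longer spell 0x12 0x34 0x5C
```

This is exactly why a binary transfer plus field-level conversion works, while a blanket character-set conversion followed by field-level conversion does not.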
Hi Arnd,
Our problem got solved. We weren't interpreting the COBOL file structure properly. Our COBOL file contains three record types: Ctl, Header, and Detail. Each record type has different fields, so the CFF stage showed junk data when we viewed it. Now we split the data in a Transformer based on record type and write it to three output files. It's working fine now.
Thanks for your input.
Ravi.
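For anyone hitting the same multi-record-type problem, the split can be sketched outside DataStage too. This hypothetical version assumes the record type is the first character of each record; the actual position and type codes come from your copybook:

```python
def split_by_record_type(records):
    """Route records to per-type buckets based on a leading type code."""
    buckets = {"C": [], "H": [], "D": []}   # Ctl, Header, Detail
    for rec in records:
        rtype = rec[:1]
        if rtype in buckets:
            buckets[rtype].append(rec)
    return buckets
```

In the DataStage job itself, the three buckets correspond to three Transformer output links, each with its own column definitions.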
I have a tool that might help with future problems. It creates a dump of the file which you can compare with your metadata. The tool is "Create a hex/ascii dump of an ebcdic cobol file"; you will find it on the DataStage Tools page of www.anotheritco.com.
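The idea behind such a dump is simple enough to sketch. This hypothetical version (not Chuck's actual tool) prints the raw bytes as hex alongside their EBCDIC (cp037) interpretation, so you can line the bytes up against your copybook:

```python
def ebcdic_dump(data: bytes, width: int = 16) -> str:
    """Hex dump of raw bytes with an EBCDIC (cp037) character column."""
    lines = []
    for off in range(0, len(data), width):
        chunk = data[off:off + width]
        hexpart = " ".join(f"{b:02X}" for b in chunk)
        text = chunk.decode("cp037")
        text = "".join(c if c.isprintable() else "." for c in text)
        lines.append(f"{off:06X}  {hexpart:<{width * 3}} {text}")
    return "\n".join(lines)
```

Packed (COMP-3) fields show up as "unprintable" dots in the character column but are readable in the hex column, which is exactly what you need when checking field offsets against the metadata.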
Chuck Smith
www.anotheritco.com
Ravi,
How are you getting the file from the mainframe? Are you using the DataStage FTP stage to pull it? Even I couldn't make CFF work for me. I was pulling the file using the FTP stage; the metadata for the file was not aligned properly with the incoming data, and I had all sorts of nightmares.
Mack
______________________________________
"Everytime I close the door on reality, it comes in through the windows." - Jennifer Yane
"Everytime I close the door on reality, it comes in through the windows." - Jennifer Yane