Extracting Packed decimal data type from A flat file
Moderators: chulett, rschirm, roy
I have a flat file as source, which is coming from a mainframe system. One of the fields in this file is a packed decimal. Can anyone tell me how to extract this field and load it into an Oracle database as a normal decimal?
Joji John
Packed decimal format equates to the COBOL COMP-3 datatype, and the SDK routines let you convert this value using the DataTypePicComp3 routine.
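To illustrate what a COMP-3 conversion routine has to do (this is a Python sketch of the packed-decimal format itself, not the SDK's DataTypePicComp3 code): each byte holds two BCD digits, and the low nibble of the last byte is the sign.

```python
def unpack_comp3(data: bytes, scale: int = 0):
    """Decode an IBM COMP-3 (packed decimal) field.

    Each byte carries two 4-bit BCD digits; the final nibble is
    the sign (0xC or 0xF = positive, 0xD = negative).
    """
    nibbles = []
    for byte in data:
        nibbles.append(byte >> 4)
        nibbles.append(byte & 0x0F)
    sign = nibbles.pop()                # last nibble is the sign
    if any(d > 9 for d in nibbles):
        raise ValueError("non-BCD digit: field is not valid packed decimal")
    value = 0
    for d in nibbles:
        value = value * 10 + d
    if sign == 0x0D:
        value = -value
    # scale = number of implied decimal places (the V in PIC S9(3)V99)
    return value / (10 ** scale) if scale else value

# A PIC S9(5) COMP-3 value of -12345 packs into three bytes: 12 34 5D
print(unpack_comp3(bytes([0x12, 0x34, 0x5D])))   # -12345
```

The ValueError branch is also why a routine fails with something like "unable to convert ' Z ' into number" when the field has been corrupted or was never packed in the first place.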
How to define the source stage
How will I define the packed decimal in the source Sequential File stage?
Are there any other stages that handle this data type?
My source file is of the following structure:
Source Location NUM
Transaction Code CHAR
Blanket Order Number CHAR
Blanket Order Release Number CHAR
P.O. Item Sequence Number PACKED DECIMAL
What stage can I use to define this source, and how do I do it?
Joji John
jojipjohn wrote: I think my packed fields have characters in them too... I am getting this error when I used the routine: "Error, unable to convert ' Z ' into number". Could you please tell me how to convert this?

Your input shouldn't be declared as integer. As noted earlier, declare it as character; the output after the conversion should be integer.
Impossible doesn't mean 'it is not possible' actually means... 'NOBODY HAS DONE IT SO FAR'
Remember that doing an EBCDIC to ASCII conversion on a packed field will "ruin" it. You can always use the BASIC EBCDIC function to re-convert the field so that it can be unpacked.
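A quick way to see why: a character-set translation remaps every byte value, but a packed field's bytes are digits, not characters. This Python sketch simulates a file-wide EBCDIC-to-ASCII translation (cp037 is an assumed codepage for illustration) and the reverse mapping that a function like BASIC's Ebcdic() conceptually performs:

```python
# -12345 as a PIC S9(5) COMP-3 field
packed = bytes([0x12, 0x34, 0x5D])

# Simulate translating the whole record EBCDIC (cp037) -> ASCII:
garbled = packed.decode("cp037").encode("latin-1")
assert garbled != packed        # the packed bytes are now scrambled

# Applying the reverse translation restores the original bytes,
# which is why re-converting before unpacking works:
restored = garbled.decode("latin-1").encode("cp037")
assert restored == packed
```

The safest approach is still to avoid translating the packed columns at all; only character columns should go through the EBCDIC/ASCII mapping.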
We just went through this exercise. In addition to the multiple routines available for the various COBOL COMP formats, you can also use the CFF (Complex Flat File) stage. It works GREAT! (But requires a little time to learn.)
Import your COBOL copy book to save some time (I think it has to be a .cfb file, something like that).
For the packed and/or signed columns, there is a more detailed property page for each column. Right-click on the row in the COLUMNS tab, then select EDIT... (or double-right click on the row number).
To verify the DataStage metadata is consistent with your COBOL layout, look at the bottom of that dialog box, which shows the actual COBOL layout. (If you make changes to the definition, you can use the APPLY button to refresh the COBOL property at the bottom.)
Having just learned the CFF stage, I DEFINITELY recommend it over calling the routines in the transformer.
We have NOT tested what happens if you get invalid (non-numeric) data in a field (Sorry)
Rick H
Senior Consultant