Extracting a packed decimal data type from a flat file

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

jojipjohn
Participant
Posts: 52
Joined: Wed May 17, 2006 3:40 am
Location: India

Extracting a packed decimal data type from a flat file

Post by jojipjohn »

I have a flat file as source, which is coming from a mainframe system. One of the fields in this file is a packed decimal. Can anyone tell me how to extract this field and load it into an Oracle database as a normal decimal?
Joji John
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Packed decimal format equates to the COBOL COMP-3 datatype, and the SDK routines let you convert this value using the DataTypePicComp3 routine.
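For reference, here is a minimal Python sketch of what a COMP-3 (packed decimal) field looks like at the byte level and how it decodes. This is purely illustrative; in a DataStage job the DataTypePicComp3 SDK routine performs this conversion for you.

```python
def unpack_comp3(data: bytes) -> int:
    """Decode a COBOL COMP-3 (packed decimal) byte string into an int.

    Every nibble holds one decimal digit, except the final nibble,
    which holds the sign: 0xD means negative; 0xC or 0xF mean positive.
    """
    digits = []
    for byte in data:
        digits.append(byte >> 4)      # high nibble
        digits.append(byte & 0x0F)    # low nibble
    sign_nibble = digits.pop()        # last nibble is the sign
    value = 0
    for d in digits:
        if d > 9:
            raise ValueError(f"invalid digit nibble: {d:#x}")
        value = value * 10 + d
    return -value if sign_nibble == 0xD else value

# 0x12 0x34 0x5C packs the value +12345; 0x98 0x7D packs -987
print(unpack_comp3(b"\x12\x34\x5C"))   # 12345
print(unpack_comp3(b"\x98\x7D"))       # -987
```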
jojipjohn
Participant
Posts: 52
Joined: Wed May 17, 2006 3:40 am
Location: India

Post by jojipjohn »

Thanks for the response.

I have not used this before. Can you please give some more details on how to do this? It would be of great help for my project.
Joji John
kumar_s
Charter Member
Posts: 5245
Joined: Thu Jun 16, 2005 11:00 pm

Post by kumar_s »

Use a Transformer stage to call the routine: pass the packed decimal column as the argument to the routine and map the result to an integer.
If NLS is enabled in your project, set your source map to NONE.
Impossible doesn't mean 'it is not possible' actually means... 'NOBODY HAS DONE IT SO FAR'
jojipjohn
Participant
Posts: 52
Joined: Wed May 17, 2006 3:40 am
Location: India

How to define the source stage

Post by jojipjohn »

How do I define the packed decimal in the source Sequential File stage?
Are there any other stages that handle this data type?
My source file is of the following structure:

Source Location NUM
Transaction Code CHAR
Blanket Order Number CHAR
Blanket Order Release Number CHAR
P.O. Item Sequence Number PACKED DECIMAL

What stage can I use to define this source, and how do I do it?
Joji John
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Just define your source field as a character of the appropriate size, typically half of the output size. Pass the character field into the SDK routine in your derivation, and take an integer out.
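The "half the output size" rule of thumb comes from the packing itself: each byte holds two digit nibbles, with one nibble reserved for the sign. A small illustrative Python helper (not part of DataStage) for working out the on-file byte size of a packed field:

```python
def comp3_bytes(num_digits: int) -> int:
    """Storage size in bytes of a COMP-3 field with the given digit count.

    Each byte holds two nibbles, and one nibble is reserved for the sign,
    so n digits need ceil((n + 1) / 2) bytes.
    """
    return (num_digits + 2) // 2

# A PIC S9(7) COMP-3 field occupies 4 bytes in the file,
# so the source column would be defined as Char(4).
print(comp3_bytes(7))  # 4
print(comp3_bytes(5))  # 3
```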
-craig

"You can never have too many knives" -- Logan Nine Fingers
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

And if it's a signed field, then you should use the DataTypePicS9() function.
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
jojipjohn
Participant
Posts: 52
Joined: Wed May 17, 2006 3:40 am
Location: India

Post by jojipjohn »

I think my packed fields contain characters as well... I am getting this error when I use the routine: "Error, unable to convert ' Z ' into number".
Could you please tell me how to convert this?
Joji John
kumar_s
Charter Member
Posts: 5245
Joined: Thu Jun 16, 2005 11:00 pm

Post by kumar_s »

jojipjohn wrote: I think my packed fields contain characters as well... I am getting this error when I use the routine: "Error, unable to convert ' Z ' into number".
Could you please tell me how to convert this?
Your input shouldn't be declared as an integer. As suggested above, declare the input as a character; the output after the conversion should be an integer.
Impossible doesn't mean 'it is not possible' actually means... 'NOBODY HAS DONE IT SO FAR'
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

What function are you using? A 'Z' in the field means it's a signed number. The only function available for signed fields is DataTypePicS9().
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Remember that doing an EBCDIC-to-ASCII conversion on a packed field will "ruin" it. You can always use the BASIC EBCDIC function to re-convert the field so that it can be unpacked.
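To see why a record-level character-set conversion corrupts packed data, here is a small Python demonstration. It uses Python's cp037 codec as a stand-in for the mainframe EBCDIC code page; this is illustrative only, not the DataStage BASIC EBCDIC function itself. The translation treats the binary packed bytes as characters and rewrites them; applying the inverse mapping restores the original bytes, which is the point of re-converting before unpacking.

```python
# Packed bytes for -987 in COMP-3: digits 9, 8, 7 and sign nibble 0xD.
packed = b"\x98\x7D"

# A whole-record EBCDIC-to-ASCII translation (cp037 -> latin-1 here)
# treats these binary bytes as characters and rewrites them.
mangled = packed.decode("cp037").encode("latin-1")
print("original:", packed.hex(), " after translation:", mangled.hex())
assert mangled != packed  # the packed value is no longer intact

# Applying the reverse mapping restores the original packed bytes,
# which is what re-converting the field back to EBCDIC achieves.
restored = mangled.decode("latin-1").encode("cp037")
assert restored == packed
```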
newtier
Premium Member
Posts: 27
Joined: Mon Dec 13, 2004 5:50 pm
Location: St. Louis, MO

Post by newtier »

We just went through this exercise. In addition to the multiple routines available for the various COBOL COMP formats, you can also use the CFF stage. It works GREAT! (But requires a little time to learn.)

Import your COBOL copybook to save some time (I think it has to be a .cfb file, or something like that).

For the packed and/or signed columns, there is a more detailed property page for each column. Right-click on the row in the COLUMNS tab, then select EDIT... (or double-right click on the row number).

To verify the DataStage metadata is consistent with your COBOL layout, look at the bottom of that dialog box, which shows the actual COBOL layout. (If you make changes to the definition, you can use the APPLY button to refresh the COBOL property at the bottom.)

Having just learned the CFF stage, I DEFINITELY recommend it over calling the routines in the transformer.

We have NOT tested what happens if you get invalid (non-numeric) data in a field. (Sorry!)
Rick H
Senior Consultant
jojipjohn
Participant
Posts: 52
Joined: Wed May 17, 2006 3:40 am
Location: India

Post by jojipjohn »

Should I use mainframe jobs to convert this, or can I do it using server jobs?
Joji John
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

You can do this using the server jobs.
-craig

"You can never have too many knives" -- Logan Nine Fingers
jojipjohn
Participant
Posts: 52
Joined: Wed May 17, 2006 3:40 am
Location: India

Post by jojipjohn »

I tried using a CFF stage, but I am getting all the columns as NULL when I view the data. Can someone please give details on how to use the CFF stage?
Joji John