packed decimal

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.


dodda
Premium Member
Posts: 244
Joined: Tue May 29, 2007 11:31 am

packed decimal

Post by dodda »

Hi

I have a problem converting a packed decimal. I have a fixed-width flat file coming from a mainframe. I imported the COBOL copybook definitions and saved them as table definitions, and I am reading the fixed-width flat file with a Sequential File stage. I am using a Column Import stage to parse the records, loading the COBOL copybook definition to split each row into multiple columns. After loading the copybook definitions, one of the columns is a packed decimal, and I need to convert it to a signed integer.

The column value
000005138{

My output is a sequential file. When I map this column via a Transformer, I get 0 in the file.

appreciate your help

Thanks
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Try using a Complex Flat File stage, which has got this functionality built right in.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
nirdesh2
Participant
Posts: 56
Joined: Thu Nov 20, 2008 12:18 pm
Location: Noida

Post by nirdesh2 »

We have also faced the same problem. First, you should use a Complex Flat File stage to read the data. If your data contains packed decimal values, read the data file through the Complex Flat File stage with the Character Set set to EBCDIC and the Data Format set to Binary.
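For background, here is a minimal Python sketch (illustration only, not DataStage code) of how a packed decimal (COMP-3) field sits in the raw bytes, which is why the file has to be read with Data Format = Binary rather than as translated text. The function name and sample bytes are my own, hypothetical examples.

Code:

def unpack_comp3(raw: bytes) -> int:
    """Decode an IBM packed decimal (COMP-3) field.

    Each byte holds two BCD digits; the low nibble of the last
    byte is the sign: 0xC or 0xF = positive, 0xD = negative.
    """
    digits = []
    for byte in raw[:-1]:
        digits.append((byte >> 4) & 0x0F)
        digits.append(byte & 0x0F)
    last = raw[-1]
    digits.append((last >> 4) & 0x0F)
    sign_nibble = last & 0x0F

    value = 0
    for d in digits:
        value = value * 10 + d
    return -value if sign_nibble == 0x0D else value

# Hypothetical samples: 0x12 0x34 0x5C encodes +12345, 0x12 0x34 0x5D encodes -12345
print(unpack_comp3(bytes([0x12, 0x34, 0x5C])))  # 12345
print(unpack_comp3(bytes([0x12, 0x34, 0x5D])))  # -12345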
Nirdesh Kumar
Mike
Premium Member
Posts: 1021
Joined: Sun Mar 03, 2002 6:01 pm
Location: Tampa, FL

Re: packed decimal

Post by Mike »

dodda wrote:The column value
000005138{
Technically, this is a zoned decimal (with overpunched sign) and not a packed decimal, but the advice for using the CFF stage still applies.
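As a quick illustration (Python, not DataStage code): under the overpunch convention, a trailing '{' means the last digit is 0 with a positive sign, so the sample value should decode to +51380.

Code:

# Assuming the value arrives in display (ASCII) form:
value = "000005138{"            # '{' = overpunched +0 in the last position
print(int(value[:-1] + "0"))    # 51380, i.e. the expected signed integer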

Mike
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Yup, technically. :wink:

Harkens back to my Hollerith / punch card days, when you could actually see the "overpunch" needed for this.

And yes, I am a member of the Old Fart club, thank you very much. :lol:
-craig

"You can never have too many knives" -- Logan Nine Fingers
beaudean
Participant
Posts: 3
Joined: Fri May 01, 2009 9:16 am

Post by beaudean »

I am having an issue moving a job from server to parallel. It uses a CFF stage; I saved the file layout from my server job and imported it into my parallel job, but I keep getting an error when I try to view the data in the file. I can view the data up until I include PMT_FORMAT_CODE (below), defined in the COBOL copybook as S9(1). When I define it as a CHAR, the data comes back, but of course it is unreadable. I have copied part of the file definition below. Any thoughts would be appreciated.

Regards,
Jeff

#################################################################
#### STAGE: OPMTPACA
## Operator
import
## Operator options
-schema record
{record_length=475, delim=none, quote=none, binary, ebcdic, native_endian, round=round_inf, nofix_zero}
(
FILLER_1_1:raw[1];
PMT_PROD_ACCT_NO:decimal[9,0] {packed};
PMT_CREDIT_ACCT_NO:decimal[11,0] {packed};
PMT_CARDHOLDER_NO:string[19];
PMT_POSTING_DATE:decimal[8,0] {packed};
FILLER_37_40:raw[4];
PMT_TRANSACTION_CODE:decimal[3,0] {packed};
PMT_FORMAT_CODE:decimal[1,0] {zoned};
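For what it's worth, here is a minimal Python sketch (illustration only, not DataStage code) of how a single zoned S9(1) byte such as PMT_FORMAT_CODE is laid out when the file is read as binary EBCDIC. The function name and sample bytes are hypothetical.

Code:

def decode_zoned_ebcdic_digit(byte: int) -> int:
    """Decode one EBCDIC zoned-decimal byte (a PIC S9(1) field).

    The low nibble is the digit; the high (zone) nibble carries the sign:
    0xF = unsigned/positive, 0xC = positive, 0xD = negative.
    """
    zone, digit = byte >> 4, byte & 0x0F
    if zone not in (0xF, 0xC, 0xD) or digit > 9:
        raise ValueError(f"not a zoned decimal byte: {byte:#04x}")
    return -digit if zone == 0xD else digit

print(decode_zoned_ebcdic_digit(0xF3))  # 3  (unsigned)
print(decode_zoned_ebcdic_digit(0xC3))  # 3  (+3)
print(decode_zoned_ebcdic_digit(0xD3))  # -3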
vivekgadwal
Premium Member
Posts: 457
Joined: Tue Sep 25, 2007 4:05 pm

Re: packed decimal

Post by vivekgadwal »

Mike wrote:
dodda wrote:The column value
000005138{
Technically, this is a zoned decimal (with overpunched sign) and not a packed decimal, but the advice for using the CFF stage still applies.

Mike
Hello,

I have been having the same issue with reading zoned decimals and a topic on that is still active:

viewtopic.php?t=131188&postdays=0&postorder=asc&start=0

Dodda - please try the approaches that the gurus have mentioned in that thread and see if any of them work. Please do let us know the results, because I am anxiously waiting for them (unfortunately, for my problem, it didn't work!).

Thanks.
Vivek Gadwal

Experience is what you get when you didn't get what you wanted
sshri
Participant
Posts: 22
Joined: Fri Dec 21, 2007 3:13 pm

Post by sshri »

You have to use the DataStage transform DataTypePicS9 to convert the signed (zoned) value to an integer.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

...and how do you propose they use a server job Transform in a parallel job?
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
sshri
Participant
Posts: 22
Joined: Fri Dec 21, 2007 3:13 pm

Post by sshri »

Sorry, I didn't realize the requirement is for Parallel jobs.

In a zoned (signed display) field, the sign is overpunched into the last byte of the data. You can interpret it as shown below.

Negative Values
} -0
J -1
K -2
L -3
M -4
N -5
O -6
P -7
Q -8
R -9

Positive Values
{ 0
A 1
B 2
C 3
D 4
E 5
F 6
G 7
H 8
I 9
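To make that table concrete, here is a minimal Python sketch (illustration only, not DataStage code) that applies the mapping above to a value with a trailing overpunched sign. The function name is my own.

Code:

# Trailing-overpunch mapping from the table above.
POSITIVE = {c: d for d, c in enumerate("{ABCDEFGHI")}   # { A..I -> +0..+9
NEGATIVE = {c: d for d, c in enumerate("}JKLMNOPQR")}   # } J..R -> -0..-9

def overpunch_to_int(text: str) -> int:
    """Convert a display/zoned value such as '000005138{' to a signed int."""
    last = text[-1]
    if last in POSITIVE:
        return int(text[:-1] + str(POSITIVE[last]))
    if last in NEGATIVE:
        return -int(text[:-1] + str(NEGATIVE[last]))
    return int(text)  # no overpunch: plain unsigned digits

print(overpunch_to_int("000005138{"))  # 51380
print(overpunch_to_int("00000123J"))   # -1231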
vivekgadwal
Premium Member
Posts: 457
Joined: Tue Sep 25, 2007 4:05 pm

Post by vivekgadwal »

sshri wrote:Sorry, I didn't realize the requirement is for Parallel jobs.

In a zoned (signed display) field, the sign is overpunched into the last byte of the data. You can interpret it as shown below.

Negative Values
} -0
J -1
K -2
L -3
M -4
N -5
O -6
P -7
Q -8
R -9

Positive Values
{ 0
A 1
B 2
C 3
D 4
E 5
F 6
G 7
H 8
I 9
I am unsure whether the OP got a solution to this issue, but I got mine by using a Sequential File stage and setting the Type Defaults in the following way:

Code:

Packed: no (overpunch)
Sign: trailing **Please set this explicitly even though this is the default**
Good luck and let us know if this works for you.
Vivek Gadwal

Experience is what you get when you didn't get what you wanted