Reading Packed Decimals in Complex Flat File stage

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

srds2
Premium Member
Posts: 66
Joined: Tue Nov 29, 2011 6:56 pm

Reading Packed Decimals in Complex Flat File stage

Post by srds2 »

Hi, I am getting all 0s (though the column has proper values in the file) while trying to read packed decimals using the Complex Flat File stage in DataStage version 8.5. Below are my stage settings.

- Data Format: Binary
- Character set: EBCDIC (Tried with ASCII as well)
- I have given 0 as the default value for this column (Packed Decimal)

Can anyone please let me know if there are any specific settings needed in order to read packed decimals?

Thanks in advance.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Let's start with you confirming the PICTURE clause of your Packed Decimal fields.
-craig

"You can never have too many knives" -- Logan Nine Fingers
srds2
Premium Member
Posts: 66
Joined: Tue Nov 29, 2011 6:56 pm

Post by srds2 »

Hi Chulett, It is PIC 9(11) COMP-3.

Thanks.
FranklinE
Premium Member
Posts: 739
Joined: Tue Nov 25, 2008 2:19 pm
Location: Malvern, PA

Post by FranklinE »

Setting the default attribute can mask the problem. Remove it and see if you get a warning or fatal message.

The important attribute at both the file and column levels is Allow all zeroes. This should be set to "yes" at least for the packed decimal column.

If you can't get it to work from that, please post the column attributes.

FAQ for handling EBCDIC (mainframe) data: viewtopic.php?t=143596
Franklin Evans
"Shared pain is lessened, shared joy increased. Thus do we refute entropy." -- Spider Robinson

Using mainframe data FAQ: viewtopic.php?t=143596 Using CFF FAQ: viewtopic.php?t=157872
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

srds2 wrote:It is PIC 9(11) COMP-3.
Thanks. Just wanted to see if it was signed and that it was actually a COMP-3. Sometimes people have odd definitions of what "packed decimal" means. :wink:
-craig

"You can never have too many knives" -- Logan Nine Fingers
srds2
Premium Member
Posts: 66
Joined: Tue Nov 29, 2011 6:56 pm

Post by srds2 »

Thanks Chulett and Franklin. I tried setting "Allow all zeroes" to yes after going through a few posts in the forum, but it didn't work.

When I removed the default value, I got the error "Field has Import error and no default value", so I tried it with a default value.

One more thing I noticed: whatever default value I set for this column (0, 3, anything) is the value I get back. If it is 0, I see all 0s in that column (00000000000); if I set it to 3, it is 00000000003. Though there are proper values in that column in the file, it shows the default values as if nulls were coming in. Below are the column attributes:
General Attributes:
- Native type: DECIMAL
- Length: 11
- Scale: 0
- Nullable: yes
Extended Attributes:
- Level number: 02
- Usage: COMP-3
- Default: 0
Derived Attributes:
- SQL Type: Decimal
- Storage Length: 6
- Picture: PIC 9(11) COMP-3
FranklinE
Premium Member
Posts: 739
Joined: Tue Nov 25, 2008 2:19 pm
Location: Malvern, PA

Post by FranklinE »

The last thing I can think of not shown so far is what your EBCDIC data looks like. Things to look for are the location of the sign half-byte -- even unsigned COMP-3 reserves the half-byte and places hex 'F' in it -- and verification of the starting position and length of the EBCDIC field.
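
To make the half-byte layout concrete, here is a minimal Python sketch (not DataStage code; the sample value is hypothetical) of how an unsigned PIC 9(11) COMP-3 field unpacks:

Code:

def unpack_comp3(raw: bytes) -> int:
    """Decode a packed-decimal (COMP-3) field into a Python int."""
    nibbles = []
    for byte in raw:
        nibbles.append(byte >> 4)    # high half-byte
        nibbles.append(byte & 0x0F)  # low half-byte
    sign = nibbles.pop()             # the last half-byte holds the sign
    value = 0
    for digit in nibbles:
        if digit > 9:
            raise ValueError("not a valid COMP-3 digit: %X" % digit)
        value = value * 10 + digit
    return -value if sign == 0x0D else value  # 0xD = negative; 0xC/0xF = positive/unsigned

# PIC 9(11) COMP-3 packs 11 digits plus the sign half-byte into 6 bytes:
assert unpack_comp3(bytes.fromhex("12345678901f")) == 12345678901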

Post what you can. You have a real puzzle, no doubt about it.
Franklin Evans
"Shared pain is lessened, shared joy increased. Thus do we refute entropy." -- Spider Robinson

Using mainframe data FAQ: viewtopic.php?t=143596 Using CFF FAQ: viewtopic.php?t=157872
Nsg
Premium Member
Posts: 37
Joined: Thu Jan 26, 2006 1:21 pm

Post by Nsg »

Is it a signed field? If so, you may have to specify that.
srds2
Premium Member
Posts: 66
Joined: Tue Nov 29, 2011 6:56 pm

Post by srds2 »

Thanks Franklin for the information.

The mainframe team transferred the files in BINARY format with the XLATE=NO parameter. I was able to read the file, but when I try to read Header and Detail in one CFF stage I get wrong data in the Header record, while the Detail records are read properly.
When I read Header and Detail in two separate CFF stages, I get proper values.

Can you please let me know on what basis I should choose the byte order (Native Endian/Big Endian/Little Endian), how to find out whether the data is signed or unsigned, and which byte holds the sign value?

Thanks again for your help in resolving these issues.
FranklinE
Premium Member
Posts: 739
Joined: Tue Nov 25, 2008 2:19 pm
Location: Malvern, PA

Post by FranklinE »

You have two separate issues here.
srds2 wrote:MainFrame team has transferred the files in BINARY format with XLATE=NO parameter, I was able to read the file but when I try to read Header and Detail in one CFF stage I am getting wrong data in the Header record but Detail records are being read properly.
When I read Header and Detail in two CFF stages then I am getting proper values.
This could be one of several problems. I'm not familiar with the XLATE parameter, so I can't comment on it. The first thing I'd look for in the bad header data is that you have the correct copybook for it. It appears that the detail copybook is good, so you may need to find differences between the jobs rather than something wrong in the data. You still could have some sort of coding error in the single CFF. Keep looking for a difference there.
srds2 wrote:Can you please let me know on what basis I should choose the Byte order? (Native Endian/Big Endian/Little Endian) and how to find out whether the data is signed or unsigned and which byte is the Sign value.
Byte order is platform specific. Standard (as in most common) implementation is big endian with the sign in the last half-byte in storage. You need your mainframe group to verify this for you, or you need access to the data on the mainframe and the ability to view it in hexadecimal format.

Your PIC clause defines an unsigned packed decimal. You should see x"F" in the sign half-byte of every instance of this field. Its location will tell you the rest.
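
If you can get at the landed file on the Unix side, here is a rough Python sketch for eyeballing that sign half-byte (file name, record length, and field offset are hypothetical -- take the real ones from your copybook):

Code:

# Dump the raw bytes of the suspect COMP-3 field from the first record.
with open("input.dat", "rb") as f:   # hypothetical file name
    record = f.read(100)             # assumed fixed record length
field = record[57:63]                # assumed offset 57; PIC 9(11) COMP-3 occupies 6 bytes
print(field.hex())                   # an unsigned field should end in 'f'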

There is another question I didn't think to ask before: Is the file being transferred from mainframe to Unix first? If so, and you are using CFF to read a sequential file, then you also need to verify that FTP is writing that file correctly with the EBCDIC character set.
Franklin Evans
"Shared pain is lessened, shared joy increased. Thus do we refute entropy." -- Spider Robinson

Using mainframe data FAQ: viewtopic.php?t=143596 Using CFF FAQ: viewtopic.php?t=157872
srds2
Premium Member
Posts: 66
Joined: Tue Nov 29, 2011 6:56 pm

Post by srds2 »

Thanks for the information Franklin.

Yes, the mainframe team is sending this file from the mainframe to Unix using NDM, and I have been told that XLATE=NO transfers the data without conversion from EBCDIC to ASCII, so the data arrives in EBCDIC format.
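
I will also sanity-check that the data really landed in EBCDIC with something like this rough Python sketch (file name and field position are hypothetical; cp037 is a common EBCDIC code page, ours may differ):

Code:

# A known alphanumeric field (e.g. at the start of the header) should decode cleanly as EBCDIC.
with open("input.dat", "rb") as f:   # hypothetical landed file
    head = f.read(20)                # assumed text field at the start of the record
print(head.decode("cp037"))          # readable text here means EBCDIC survived the NDM transfer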

I will check on the CFF stage settings of the Header and Detail records and their layouts.

Thanks.
PaulVL
Premium Member
Posts: 1315
Joined: Fri Dec 17, 2010 4:36 pm

Post by PaulVL »

Did you by chance notice if this error started happening after applying FP2?

My site is getting the same thing in our DEV environment (after FP2 + RU5). Our QA environment, which is not patched yet, still reads the COMP-3 column values correctly.

I opened a PMR on it today.
PaulVL
Premium Member
Posts: 1315
Joined: Fri Dec 17, 2010 4:36 pm

Post by PaulVL »

Found this little gem, applied it to my environment, and it fixed my issue. You might also try it.


http://www-01.ibm.com/support/docview.w ... wg21626867
srds2
Premium Member
Posts: 66
Joined: Tue Nov 29, 2011 6:56 pm

Post by srds2 »

Thank you all for the information.

Not sure if the fix pack is causing the problem, because we are on Fix Pack 3, and when I read Header and Detail separately (using two CFF stages) I am able to read the data properly. So I am thinking it might not be a problem with the environment.

When I try reading Header and Detail using one CFF stage (both layouts have the same record length), I get junk data like "Invalid" plus a few characters in all the columns, and the same invalid value repeated in all the rows as well. Not sure why I am getting this junk data. Can anyone help me identify the problem?
FranklinE
Premium Member
Posts: 739
Joined: Tue Nov 25, 2008 2:19 pm
Location: Malvern, PA

Post by FranklinE »

Keep looking for differences between using the one CFF (bad) and two CFF (good). It may not help, but try examining the compiled OSH code in each and maybe something will stand out.
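
One quick way to compare the two jobs' generated OSH, once you have saved each to a file (paths here are hypothetical; any diff tool works just as well):

Code:

import difflib

one_cff = open("one_cff_job.osh").read().splitlines()
two_cff = open("two_cff_job.osh").read().splitlines()
for line in difflib.unified_diff(one_cff, two_cff, "one_cff", "two_cff", lineterm=""):
    print(line)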

At this point, a detailed view of the code is important. I don't know how much you can post here (I have strict restrictions on what I can show), but anything more could help us help you.
Franklin Evans
"Shared pain is lessened, shared joy increased. Thus do we refute entropy." -- Spider Robinson

Using mainframe data FAQ: viewtopic.php?t=143596 Using CFF FAQ: viewtopic.php?t=157872