Hello Forum,
I have an EBCDIC file with packed decimal (COMP-3) fields. My source system uses EBCDIC blanks (0x40) to encode NULL for these fields. This makes the last nibble of the packed decimal invalid, and DataStage throws a fit when it tries to parse it.
The data parses fine in the Server edition CFF stage, where the engine converts the invalid packed decimals to '??????'.
How can I set up the CFF stage in a Parallel job to handle this situation? In other words, how can I tell the CFF stage what my null value is for the packed decimal field?
Is it possible to work around this CFF limitation by using the Sequential File stage as a source? If so, what would the packed decimal field settings be?
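To make the problem concrete, here is a minimal sketch (in Python, purely as an illustration of the encoding, not anything DataStage-specific) of decoding a COMP-3 field while treating an all-blank (0x40) field as NULL. The function name and scale handling are my own assumptions:

```python
def unpack_comp3(raw: bytes, scale: int = 0):
    """Decode a COMP-3 (packed decimal) field.

    Returns None when the field is entirely EBCDIC blanks (0x40),
    which this source system uses to encode NULL.
    """
    if all(b == 0x40 for b in raw):        # blank-filled field => NULL
        return None
    # split each byte into its high and low nibbles
    nibbles = [n for b in raw for n in (b >> 4, b & 0x0F)]
    sign = nibbles.pop()                   # last nibble is the sign
    if sign not in (0x0C, 0x0D, 0x0F):     # C/F = positive, D = negative
        raise ValueError(f"invalid sign nibble 0x{sign:X}")
    value = 0
    for d in nibbles:
        if d > 9:
            raise ValueError(f"invalid digit nibble 0x{d:X}")
        value = value * 10 + d
    if sign == 0x0D:
        value = -value
    return value / (10 ** scale) if scale else value

print(unpack_comp3(b'\x12\x34\x5C'))   # 12345
print(unpack_comp3(b'\x40\x40\x40'))   # None (blank-filled NULL)
```

Note that a blank byte 0x40 splits into nibbles 4 and 0, so the final nibble of a blank-filled field is 0x0, which is not a valid sign code; that is exactly why a strict parser rejects these records.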
Thank you,
Greg
CFF Stage - Handling packed decimal NULL values
There is a qualification you can make to decimal types, which is "Allow All Zeroes". A packed decimal field completely filled with zeroes is normally illegal; this setting overrides that and allows an all-zeroes packed decimal field to be understood to represent 0.0. This may help, although you say your fields have blanks rather than zeroes. Worth a try, in any case.
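To illustrate why an all-zeroes field is normally illegal: the last nibble of a packed decimal must be a sign code (0xC, 0xD, or 0xF), so a field of 0x00 bytes ends in sign nibble 0x0, which a strict parser rejects. A sketch (Python, with a hypothetical validator of my own naming) of what an "Allow All Zeroes" override effectively does:

```python
VALID_SIGNS = {0x0C, 0x0D, 0x0F}

def is_valid_comp3(raw: bytes, allow_all_zeroes: bool = False) -> bool:
    """Strict packed-decimal check, with an optional all-zeroes override."""
    if allow_all_zeroes and all(b == 0x00 for b in raw):
        return True                        # understood to represent 0.0
    if (raw[-1] & 0x0F) not in VALID_SIGNS:
        return False                       # sign nibble 0x0 is illegal
    # every other nibble must be a decimal digit 0-9
    digits = [n for b in raw for n in (b >> 4, b & 0x0F)][:-1]
    return all(n <= 9 for n in digits)

print(is_valid_comp3(b'\x00\x00\x00'))                         # False
print(is_valid_comp3(b'\x00\x00\x00', allow_all_zeroes=True))  # True
print(is_valid_comp3(b'\x40\x40\x40', allow_all_zeroes=True))  # False
```

Note the last line: a blank-filled (0x40) field still fails even with the override, since 0x40 bytes are not 0x00 bytes, which is consistent with the result reported below in the thread.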
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
I tried the CFF Record option "Allow all zeros", but this didn't work. I still get: Complex_Flat_File_0,0: Field "F1_VOLUME" has import error and no default value; data: {@ @ @ @ @}, at offset: 54. Methinks this stage as currently written won't handle hex data (40 40 40 ...) coming in via a column defined as COMP-3. Oh well, the Server CFF stage will read it and let me determine how to handle the data.
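Another workaround worth considering is pre-processing the file before the Parallel CFF stage reads it: overwrite any blank-filled COMP-3 field with a valid packed zero, then recover the NULLs downstream. A sketch in Python; the 5-byte field at offset 54 is taken from the error message above, but the function name and the choice of +0 as the placeholder are my own assumptions:

```python
def patch_blank_comp3(record: bytes, offset: int, length: int) -> bytes:
    """If the COMP-3 field at [offset, offset+length) is all EBCDIC
    blanks (0x40), overwrite it with a valid packed zero (+0) so a
    strict parser can import the record; NULLs can then be recovered
    downstream by testing for the placeholder value (or via a
    separate indicator column)."""
    field = record[offset:offset + length]
    if all(b == 0x40 for b in field):
        packed_zero = b'\x00' * (length - 1) + b'\x0C'   # ...00 0C = +0
        record = record[:offset] + packed_zero + record[offset + length:]
    return record

# hypothetical fixed-length record with a blank-filled field at offset 54
rec = b'\x40' * 59
fixed = patch_blank_comp3(rec, 54, 5)
print(fixed[54:59])    # the 5 blank bytes are now a packed +0
```

The trade-off is that a genuine zero and a NULL become indistinguishable unless you also carry an indicator, so this only works when zero is not a legitimate value for the field.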
JOHN GUILLETTE
MARKHAM ON