CFF stage issue with OCCURS and blank default values

Posted: Sun Oct 07, 2012 11:11 pm
by vpatel10
We are using:

CFF stage to read an EBCDIC file.
Fixed-width file.
File has record types header/detail/trailer.
Detail record has a column with OCCURS, of type PIC S9(3)V9(4) COMP-3.
CFF stage 'Records ID' defined for each record type.
CFF stage 'Record options' - Default values defined as '0' for decimal/integer.
This is working OK.

But when we remove the default values of '0' for decimal/integer and leave them blank, the stage fails to read any record. The same setup works if the input file doesn't have any array (OCCURS) defined in it.

How can we keep the default values blank and still read the file with the CFF stage when the file has OCCURS defined in it?
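For context, here is a rough sketch of the kind of layout involved (outside DataStage; the field names, offsets and occurrence count below are made up for illustration). Each PIC S9(3)V9(4) COMP-3 occurrence packs 7 digits plus a sign nibble into 4 bytes, so the OCCURS array is just a run of fixed 4-byte slices inside the fixed-width record:

    # Hypothetical detail-record layout (names and sizes are illustrative only):
    # 01 DETAIL-REC.
    #    05 REC-TYPE        PIC X(1).
    #    05 DETAIL-KEY      PIC X(10).
    #    05 AMOUNTS OCCURS 5 TIMES
    #                       PIC S9(3)V9(4) COMP-3.
    REC_TYPE_LEN = 1
    KEY_LEN = 10
    OCCURS = 5        # made-up occurrence count
    COMP3_LEN = 4     # 3 + 4 digits + sign nibble = 8 nibbles = 4 bytes

    def slice_amounts(record: bytes):
        """Return the raw 4-byte packed slices that make up the OCCURS array."""
        base = REC_TYPE_LEN + KEY_LEN
        return [record[base + i * COMP3_LEN : base + (i + 1) * COMP3_LEN]
                for i in range(OCCURS)]

Every one of those slices has to hold valid packed digits; EBCDIC spaces (X'40404040') in an occurrence are not a decimal value.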

Posted: Mon Oct 08, 2012 12:31 am
by ray.wurlod
You can't. Blank is not a legal value for decimal and integer data types.

You might try changing the data type to Char.
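If the occurrences are brought across as Char (i.e. raw bytes) rather than Decimal, the empty-versus-zero decision can be made explicitly downstream instead of being forced to a default. The following is only a sketch of that idea in Python, not DataStage code (in a job the equivalent test and convert would sit in a Transformer or similar), and treating all-spaces or all-low-values as "empty" is an assumption about how the source writes missing occurrences:

    from decimal import Decimal

    def unpack_comp3_or_none(raw: bytes, scale: int = 4):
        """Decode a packed-decimal field, or return None for 'empty' byte patterns."""
        if raw == b"\x40" * len(raw) or raw == b"\x00" * len(raw):
            return None                      # EBCDIC spaces or low-values: treat as missing
        nibbles = [n for b in raw for n in (b >> 4, b & 0x0F)]
        sign = nibbles.pop()                 # last nibble is the sign (C/F positive, D negative)
        if sign not in (0x0C, 0x0D, 0x0F) or any(d > 9 for d in nibbles):
            raise ValueError("not packed decimal: " + raw.hex())
        value = int("".join(str(d) for d in nibbles))
        return Decimal(-value if sign == 0x0D else value).scaleb(-scale)

    unpack_comp3_or_none(bytes.fromhex("0012345c"))   # Decimal('1.2345')
    unpack_comp3_or_none(b"\x40\x40\x40\x40")         # None instead of a forced 0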

Posted: Mon Oct 08, 2012 12:57 am
by vpatel10
Thanks Ray. So how can we read null/empty decimal values without specifying any default values in the CFF stage?

Posted: Mon Oct 08, 2012 3:46 am
by ray.wurlod
What part of "you can't" was unclear?

Posted: Mon Oct 08, 2012 4:17 pm
by vmcburney
Ray, the problem is that the CFF stage is forcing the use of defaults even when they are not required. Defaults in CFF can be dodgy: defaulting an empty Decimal or Integer to 0 can cause trouble; for example, defaulting a flag field to 0 when it is blank or empty implies the wrong thing. The CFF stage forces the use of defaults, and it is impossible to work out afterwards which rows were defaulted. It forces this behaviour because of the way it parses the OCCURS clause: data without OCCURS does not complain about missing defaults, data with OCCURS does.

Is it possible to set a decimal value to NULL in the CFF default field instead of 0?

Posted: Tue Oct 09, 2012 1:32 pm
by FranklinE
Default processing on EBCDIC data is always a problem. The mainframe COBOL perspective (a hard standard that teaches a quick lesson when ignored) is to only ever default alphanumeric fields to spaces and display (non-packed) numerics to zeroes. Binary storage fields generally follow the same standard, with a numeric zero as the default; packed decimal is the sore thumb that always sticks out: the storage format permits a numeric zero in each half-byte, and a byte of two zero nibbles is the same byte value as an ASCII NUL, but it should not be read as one.
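To put byte values on that (just an illustration of the storage format, nothing DataStage-specific): for a PIC S9(3)V9(4) COMP-3 field, a genuine packed zero is X'0000000C', whose leading bytes happen to be NUL bytes yet carry valid digit nibbles and a sign, while X'00000000' (all low-values) has no C/D/F sign nibble and is not a readable packed value at all:

    packed_zero = bytes.fromhex("0000000c")   # seven 0 digits + sign nibble C: a real zero
    low_values  = bytes.fromhex("00000000")   # sign nibble 0x0 is not a valid sign
    assert packed_zero[-1] & 0x0F == 0x0C
    assert low_values[-1] & 0x0F not in (0x0C, 0x0D, 0x0F)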

The only correct solution is to go back to the process that creates the EBCDIC data and impose the standards, and not tolerate anything else. Otherwise, DataStage jobs will be rife with custom code just to handle the exceptions.

Unsolicited advice: when your upstream developers complain that you are making them do more work, you can use my rule-of-thumb: if you make me do it in DataStage, it will cost the project two or three times as much as having them do it. Make sure a manager with budget power is within earshot. Works every time for me. 8)

Posted: Fri Jun 12, 2015 11:02 am
by jzajde1
In the CFF stage, under the 'Record options' tab, there is a Default values column. Enter 0.0 as the default.