problem viewing the output of cff stage

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

anandsh16
Premium Member
Posts: 17
Joined: Tue Dec 12, 2006 3:34 am

problem viewing the output of cff stage

Post by anandsh16 »

Hi,

We are using a CFF stage to read a mainframe file which is sent in ASCII format.

We are using the below properties in the CFF stage:
byte order = native endian
character set = EBCDIC
data format = text
record delimiter = \n

While viewing the output or writing it to a file, we see only junk characters like ??????????? in all the columns.

Any idea what could be the cause of this?
PhilHibbs
Premium Member
Posts: 1044
Joined: Wed Sep 29, 2004 3:30 am
Location: Nottingham, UK

Post by PhilHibbs »

If it's sent in ASCII format, why are you defining it as EBCDIC?
Phil Hibbs | Capgemini
Technical Consultant
anandsh16
Premium Member
Posts: 17
Joined: Tue Dec 12, 2006 3:34 am

Post by anandsh16 »

PhilHibbs wrote:If it's sent in ASCII format, why are you defining it as EBCDIC?
There are many columns with values like "{0000000{00{00{00{00{00{00{00{00{00{00{0000000000000000{0000000000000000{0000000000000000{0000000000000000{0000000000000000{0000000000000000"

Aren't these EBCDIC values? So we have defined the character set as EBCDIC and the data format as Text.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

No, that's not EBCDIC.
-craig

"You can never have too many knives" -- Logan Nine Fingers
anandsh16
Premium Member
Posts: 17
Joined: Tue Dec 12, 2006 3:34 am

Post by anandsh16 »

Chulett,

Would you please throw some light on what kind of data this is?
PhilHibbs
Premium Member
Posts: 1044
Joined: Wed Sep 29, 2004 3:30 am
Location: Nottingham, UK

Post by PhilHibbs »

Looks like Decimal values, with a Signed property of "No (overpunch), leading"
Phil Hibbs | Capgemini
Technical Consultant
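For readers unfamiliar with overpunch, here is a minimal Python sketch (not DataStage code) of how a *leading* overpunch sign reads. The mapping assumes the standard zoned-decimal overpunch characters as they appear after EBCDIC-to-ASCII translation: '{' means "+ with digit 0", 'A'..'I' mean "+ with digits 1..9", '}' and 'J'..'R' the negatives.

```python
# Hypothetical helper, not part of DataStage: decode a display-numeric
# field whose sign is "overpunched" onto the first (leading) digit.
LEAD = {'{': (1, 0), '}': (-1, 0)}
LEAD.update({chr(ord('A') + d - 1): (1, d) for d in range(1, 10)})   # 'A'..'I' = +1..+9
LEAD.update({chr(ord('J') + d - 1): (-1, d) for d in range(1, 10)})  # 'J'..'R' = -1..-9

def decode_leading_overpunch(field):
    """Return the signed integer encoded in a leading-overpunch field."""
    sign, first_digit = LEAD[field[0]]
    return sign * int(str(first_digit) + field[1:])

# The values quoted in the post are all "+ with digit 0" followed by zeros:
print(decode_leading_overpunch('{0000000'))   # -> 0
print(decode_leading_overpunch('B25'))        # 'B' = +2 -> 225
print(decode_leading_overpunch('J25'))        # 'J' = -1 -> -125
```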
anandsh16
Premium Member
Posts: 17
Joined: Tue Dec 12, 2006 3:34 am

Post by anandsh16 »

PhilHibbs wrote:Looks like Decimal values, with a Signed property of "No (overpunch), leading"
Phil

Can we use a CFF stage for this, or should we use a Sequential File stage instead?
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Yes, you should be able to use the CFF stage for this, provided you know the data types. Did you try ASCII yet? Do things look better when you do?
-craig

"You can never have too many knives" -- Logan Nine Fingers
PhilHibbs
Premium Member
Posts: 1044
Joined: Wed Sep 29, 2004 3:30 am
Location: Nottingham, UK

Post by PhilHibbs »

I don't know. If you can't, you could pass the whole string through to a Column Export stage and split it up there, as that stage has the data format definition settings that you need.
Phil Hibbs | Capgemini
Technical Consultant
anandsh16
Premium Member
Posts: 17
Joined: Tue Dec 12, 2006 3:34 am

Post by anandsh16 »

chulett wrote:Yes, you should be able to use the CFF for this, provided you know the data types. Did you try ASCII yet, do things look better when you do?
If we use a CFF stage, then how do we specify the decimal properties which Phil suggested ("No (overpunch)" and leading)? I don't see such options in a CFF stage. Moreover, the columns come with a type of display numeric. Please advise.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

They're part of the Packed classification of the Decimal data type, which includes zoned and overpunch and which allows you to specify whether the 'sign' is leading or trailing.
-craig

"You can never have too many knives" -- Logan Nine Fingers
anandsh16
Premium Member
Posts: 17
Joined: Tue Dec 12, 2006 3:34 am

Post by anandsh16 »

chulett wrote:They're part of the Packed classification of the Decimal data type, which includes zoned and overpunch and which allows you to specify whether the 'sign' is leading or trailing.
I am now able to see the output using the options below:
character set = ASCII
data format = Text
record delimiter = \n
byte order = native endian

All columns are getting the same values as the source except the display numerics.
E.g. one column which is defined in the source as display numeric has a value of 00A in the source file. I think this should be displayed as 001 in DataStage as per the conversion. However, the output when viewing the data is 000.

I am not able to specify the setting suggested by Phil (overpunch and sign = trailing) as I do not find that option in the CFF stage. Please suggest.
Mike
Premium Member
Posts: 1021
Joined: Sun Mar 03, 2002 6:01 pm
Location: Tampa, FL

Post by Mike »

Common problem. You cannot do a simple EBCDIC to ASCII conversion on a COBOL file that contains DISPLAY NUMERIC data items. Typically the last byte of a DISPLAY NUMERIC is a binary byte with the sign in one nybble and the last digit in the other. A hex C1 byte represents a +1 digit (C = positive sign, 1 = digit 1). It is the character "A" in EBCDIC. When an EBCDIC to ASCII conversion is incorrectly applied, the C1 gets translated to the ASCII character "A", which has a hex value of 41 (completely losing the binary meaning stored in each nybble).

Since the DISPLAY NUMERIC items in your file have been corrupted, you will have to reverse the corruption. Fortunately, it is possible since there are only 20 possibilities. You can do this in stage variables if you have just 1 or 2 to fix; otherwise, a custom routine is probably in order. I created one of these for a client that could not go back and get their file with DISPLAY NUMERIC items resent in binary format.

Your best bet is to have the file resent in binary.

Mike
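Mike's byte-level point can be illustrated with a short Python sketch (an illustration, not the custom routine he mentions; Python's cp037 EBCDIC codec stands in for whatever translation the file actually went through):

```python
# The last byte of a zoned DISPLAY NUMERIC packs a sign nybble and a
# digit nybble. 0xC1 = sign C (positive), digit 1.
b = 0xC1
print(hex(b >> 4), b & 0x0F)            # sign nybble 0xC, digit 1

# 0xC1 is also the EBCDIC letter 'A' (code page 037 shown here):
print(bytes([b]).decode('cp037'))       # 'A'

# An EBCDIC-to-ASCII translation therefore emits ASCII 'A' (0x41),
# whose nybbles no longer mean "positive" and "1":
translated = ord('A')
print(hex(translated >> 4), translated & 0x0F)   # 0x4, 1

# The damage is reversible because each of the 20 sign-carrying bytes
# lands on a distinct ASCII character -- the 20-entry table Mike describes:
UNDO = {'{': '+0', '}': '-0'}
UNDO.update({chr(ord('A') + d - 1): '+' + str(d) for d in range(1, 10)})
UNDO.update({chr(ord('J') + d - 1): '-' + str(d) for d in range(1, 10)})
print(len(UNDO), UNDO['A'])             # 20 entries; 'A' -> '+1'
```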
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

I don't believe that anything is corrupted here. I get the impression that they just mean an "ascii" versus "binary" FTP transfer, which is all about converting line terminators (or not) and has nothing to do with character set conversions. Now, if they used a utility to convert the file from EBCDIC to ASCII, that's another animal, but it would only be a problem for packed fields, as those are neither ASCII nor EBCDIC and you are correct that any such conversion would corrupt them. Fields in display format, however, do actually need to be converted.

"00A" is a perfectly valid signed overpunch value, and yes the "A" represents a positive 1 so what that should turn into is "001+" or just "001" as you noted. It all comes down to your lack of proper settings in the stage, you need to find the settings that Phil and I have been mentioning and get them set properly for those fields.
-craig

"You can never have too many knives" -- Logan Nine Fingers