I am processing a file that contains some fields with packed data in them. For example, when the file is viewed within DS, the values look like the following:
{
5A
00{
M
L
Whereas the values of these fields should be as follows:
0
51
000
-4
-3
Is there a way within DS to convert these packed fields into ASCII?
Since these are binary fields, DataStage cannot display them without knowing which conversion to apply. If you look into the routines/functions in the SDK you will find "DataTypePicComp3", which should solve your problem.
That is not exactly what you need to know. The real questions are: from what language is this field being written, and what is the definition of that field in that language? This is kind of like a Rosetta Stone for the data...
Once the data type is known, you can either use a built-in DS function to decode it or, if no suitable one exists, use the existing functions to write your own.
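For reference, COMP-3 (packed decimal) stores two decimal digits per byte, with the sign carried in the low nibble of the final byte (C or F for positive, D for negative). This is not the SDK routine itself, just a minimal Python sketch of what a decoder such as DataTypePicComp3 has to do:

```python
def unpack_comp3(data: bytes) -> int:
    """Decode an IBM packed-decimal (COMP-3) field.

    Each byte holds two decimal digits, except the last byte, whose
    high nibble is the final digit and whose low nibble is the sign
    (0xC/0xF = positive, 0xD = negative).
    """
    digits = []
    for b in data[:-1]:
        digits.append((b >> 4) & 0xF)  # high nibble
        digits.append(b & 0xF)         # low nibble
    digits.append((data[-1] >> 4) & 0xF)  # final digit
    sign_nibble = data[-1] & 0xF
    value = int("".join(str(d) for d in digits))
    return -value if sign_nibble == 0xD else value

# Example: bytes 0x12 0x3C pack the value +123; 0x12 0x3D packs -123.
print(unpack_comp3(bytes([0x12, 0x3C])))  # 123
print(unpack_comp3(bytes([0x12, 0x3D])))  # -123
```

Note the length difference this implies: a PIC S9(5) COMP-3 field occupies 3 bytes packed but 5 characters unpacked, which is why field lengths must be adjusted after conversion.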
Either your data is wrong or your PICture clauses are. The PIC 9 datatype is not packed; it is essentially a CHAR field that contains only digits. Are you doing the EBCDIC to ASCII conversion?
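For context: a signed zoned-decimal field (PIC S9) carries its sign "overpunched" into the last character, which may explain the sample values in the original post. After EBCDIC-to-ASCII translation, '{' means +0, 'A'..'I' mean +1..+9, '}' means -0, and 'J'..'R' mean -1..-9. An illustrative Python decoder (the character maps are the standard overpunch convention; the helper name is mine):

```python
# Standard signed-overpunch character maps for the trailing digit.
POSITIVE = {'{': 0, **{chr(ord('A') + i): i + 1 for i in range(9)}}
NEGATIVE = {'}': 0, **{chr(ord('J') + i): i + 1 for i in range(9)}}

def decode_overpunch(field: str) -> int:
    """Decode a zoned-decimal value whose last character holds the sign."""
    last = field[-1]
    if last in POSITIVE:
        return int(field[:-1] + str(POSITIVE[last]))
    if last in NEGATIVE:
        return -int(field[:-1] + str(NEGATIVE[last]))
    return int(field)  # plain unsigned zoned value

# These match the sample values quoted in the question:
print(decode_overpunch("5A"))   # 51
print(decode_overpunch("M"))    # -4
print(decode_overpunch("L"))    # -3
print(decode_overpunch("{"))    # 0
```

The fact that "5A", "M", "L" and "{" decode exactly to the expected 51, -4, -3 and 0 suggests the fields are zoned decimal with overpunched signs rather than COMP-3 packed.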
Like ArndW suggested, you must do an EBCDIC to ASCII conversion using the Ascii() function before attempting to even view the data. If you have a packed field, then beyond the EBCDIC to ASCII conversion you also have to make sure the storage length is correct, as the storage length changes when you unpack that field.
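Outside of DataStage, the same EBCDIC-to-ASCII translation that Ascii() performs can be sketched with a standard codec. Python's built-in cp037 codec is one common EBCDIC variant (cp500 and cp1047 are others; check which code page your mainframe source uses):

```python
# "HELLO" encoded in EBCDIC code page 037.
ebcdic_bytes = bytes([0xC8, 0xC5, 0xD3, 0xD3, 0xD6])

# Decode EBCDIC to a Python string, i.e. the equivalent of the
# EBCDIC-to-ASCII step done by Ascii() in DataStage.
text = ebcdic_bytes.decode("cp037")
print(text)  # HELLO
```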
A hint, maybe useful, maybe not: every mainframe file would necessarily be accompanied by its COBOL file definition, so looking into the .cfd file will help you analyse the data definition associated with the respective column names. It might be that you are associating the definition with the wrong fields. Rather than trying to view the raw data, you can then use the proper lengths and routines for conversion. Otherwise, if the definition is as you have mentioned, I guess it should work as stated in the previous posts.