CFF Issue

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

g_rkrish
Participant
Posts: 264
Joined: Wed Feb 08, 2006 12:06 am

CFF Issue

Post by g_rkrish »

Hi,

I'm getting weird results from the CFF stage between server and parallel jobs. When I use the CFF stage in a parallel job it gives the right result, but when I try to load the same file through the CFF stage in a server job I see junk data in a particular field, and in some columns the data doesn't match between the parallel CFF and the server CFF. I can't understand what is going on. I have the record style set to CR/LF. Can anyone help me out with this?

Thanks,
RK
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

RK - I'll trade you answers. If you can tell me what the blinking red light on my car dashboard (the one in the top left corner) means and how to fix it, I'll help you with the CFF issue.

p.s. In case the post above was too subtle - please add some more information to your problem description; i.e. which column(s) and datatype(s).

It also helps if you could narrow down the issue to where it is simple; i.e. remove all columns that are interpreted correctly from a copy of both the data and metadata.
g_rkrish
Participant
Posts: 264
Joined: Wed Feb 08, 2006 12:06 am

Post by g_rkrish »

ArndW wrote:RK - I'll trade you answers. If you can tell me what the blinking red light on my car dashboard (the one in the top left corner) means and how to fix it, I'll help you with the CFF issue. ...
Hey Arnd,

Here are the source columns that are having problems....

Source column metadata:

03 ARS-NAME PIC X(30). Interpreted as Character(30) by the CFF stage.

Target metadata: Varchar(30)...

and five or so columns with metadata like:

03 ARS-CUMULATIVE-DRAW PIC S9(9)V99 COMP-3. Imported as Decimal(11,2) by the CFF stage.

Please let me know if you need any more info...
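(For anyone else hitting this: COMP-3, i.e. packed decimal, stores two digits per byte with the sign in the low nibble of the last byte, so PIC S9(9)V99 - 11 digits plus sign - occupies 6 bytes and imports as Decimal(11,2). A minimal Python sketch of the layout, purely for illustration, not DataStage code:)

def unpack_comp3(raw, scale=0):
    # Each byte holds two decimal digits, one per nibble; the final
    # low nibble is the sign: 0xD = negative, 0xC/0xF = positive.
    nibbles = []
    for byte in raw:
        nibbles.append(byte >> 4)
        nibbles.append(byte & 0x0F)
    sign = nibbles.pop()
    value = 0
    for digit in nibbles:
        value = value * 10 + digit
    if sign == 0x0D:
        value = -value
    return value / 10 ** scale if scale else value

# PIC S9(7)V99 COMP-3 example: 1234567.89 packs into 5 bytes.
print(unpack_comp3(bytes.fromhex("123456789c"), scale=2))  # 1234567.89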
RK
horserider
Participant
Posts: 71
Joined: Mon Jul 09, 2007 1:12 pm

Post by horserider »

Please post your COMPLETE FILE DEFINITION of the SOURCE. I recently came across COMP fields in a mainframe file and solved the issue using a server job.
g_rkrish
Participant
Posts: 264
Joined: Wed Feb 08, 2006 12:06 am

Post by g_rkrish »

horserider wrote:Please post your COMPLETE FILE DEFINITION of the SOURCE. I recently came across COMP fields in a mainframe file and solved the issue using a server job.
Here is my complete file definition...

01 ARFSLSMN-RECORD.
02 ARFSLSMN-KEY.
03 ARS-COMPANY PIC X(3).
03 ARS-DIVISION PIC 9(4) COMP-3.
03 ARS-SLSMN-NO PIC 9(4) COMP-3.
03 ARS-CATEGORY-CLASS PIC X(4).
02 ARS-REST-OF-RECORD.
03 ARS-NAME PIC X(30).
03 ARS-COMM-PCNT PIC 99V99 COMP-3.
03 ARS-COMM-DLRS OCCURS 3 TIMES
PIC S9(7)V99 COMP-3.
03 ARS-YTD-EARNINGS PIC S9(7)V99 COMP-3.
03 ARS-SPECIAL-SLC-FLAG PIC X.
03 ARS-SALES-QUOTA OCCURS 3 TIMES
PIC S9(9) COMP-3.
03 ARS-USE-DRAW PIC X.
03 ARS-DRAW-AMOUNT PIC S9(7)V99 COMP-3.
03 ARS-CUMULATIVE-DRAW PIC S9(9)V99 COMP-3.
03 ARS-CUMULATIVE-COMM PIC S9(9)V99 COMP-3.
03 ARS-TOTAL-SALES OCCURS 3 TIMES
PIC S9(9)V99 COMP-3.
03 ARS-TOTAL-COST OCCURS 3 TIMES
PIC S9(9)V99 COMP-3.
03 ARS-GP-QUOTA OCCURS 3 TIMES
PIC S9(9) COMP-3.

03 FILLER PIC X(15).
01 ARFSLSMN-CATEGORY-RECORD.
02 FILLER PIC X(13).
02 ARS-REST-OF-REC.
03 ARS-DIRECT-WH OCCURS 2 TIMES.
04 SALESMAN-CATEGORY-SALES OCCURS 3 TIMES.
05 ARS-SALES PIC S9(9)V99 COMP-3.
05 ARS-COST PIC S9(9)V99 COMP-3.
03 FILLER PIC X(81).
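(For what it's worth, tallying that copybook up - a COMP-3 field occupies ceil((digits + 1) / 2) bytes, a PIC X field one byte per character - both 01-level layouts come to 166 bytes, which is a handy cross-check on the record length. A quick Python sketch of the arithmetic, with the field list hand-transcribed from the copybook above:)

import math

# (name, digits_or_chars, is_comp3, occurs), transcribed from the copybook.
FIELDS = [
    ("ARS-COMPANY",          3, False, 1),   # PIC X(3)
    ("ARS-DIVISION",         4, True,  1),   # PIC 9(4) COMP-3
    ("ARS-SLSMN-NO",         4, True,  1),
    ("ARS-CATEGORY-CLASS",   4, False, 1),
    ("ARS-NAME",            30, False, 1),
    ("ARS-COMM-PCNT",        4, True,  1),   # 99V99
    ("ARS-COMM-DLRS",        9, True,  3),   # S9(7)V99 OCCURS 3
    ("ARS-YTD-EARNINGS",     9, True,  1),
    ("ARS-SPECIAL-SLC-FLAG", 1, False, 1),
    ("ARS-SALES-QUOTA",      9, True,  3),
    ("ARS-USE-DRAW",         1, False, 1),
    ("ARS-DRAW-AMOUNT",      9, True,  1),
    ("ARS-CUMULATIVE-DRAW", 11, True,  1),   # S9(9)V99
    ("ARS-CUMULATIVE-COMM", 11, True,  1),
    ("ARS-TOTAL-SALES",     11, True,  3),
    ("ARS-TOTAL-COST",      11, True,  3),
    ("ARS-GP-QUOTA",         9, True,  3),
    ("FILLER",              15, False, 1),
]

offset = 0
for name, digits, comp3, occurs in FIELDS:
    size = math.ceil((digits + 1) / 2) if comp3 else digits
    print(f"{name:22s} offset {offset:3d}  len {size} x {occurs}")
    offset += size * occurs
print("record length:", offset)  # 166; add 2 if each record ends in CR/LF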
RK
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

The PIC X column, ARS-NAME, shouldn't cause problems. If it does, it probably means that your record length is off and binary information is being parsed as part of that string; most likely your definition is shorter than the real data, and the contents of the subsequent COMP-3 field are coming into that field.
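(A toy illustration of that failure mode in Python, with made-up values rather than the real file: if a declared length is off by even one byte, packed bytes bleed into the string and every later field misaligns.)

# 30-byte name followed by a 4-byte COMP-3 field (hypothetical data).
rec = b"JOHN SMITH".ljust(30) + bytes.fromhex("0123456c")
good_name, good_rest = rec[:30], rec[30:]
bad_name,  bad_rest  = rec[:31], rec[31:]   # name declared one byte too long
print(bad_name[-1:])    # b'\x01' -- a packed byte showing up as junk text
print(good_rest.hex())  # 0123456c -- intact packed field
print(bad_rest.hex())   # 23456c   -- nibbles shifted; decodes as garbage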
Are you using the same copybook metadata for both PX and Server?
g_rkrish
Participant
Posts: 264
Joined: Wed Feb 08, 2006 12:06 am

Post by g_rkrish »

ArndW wrote:The PIC X column, ARS-NAME, shouldn't cause problems. If it does, it probably means that your record length is off and binary information is being parsed as part of that string, most likely your definition is shorter than the real data and the contents of the subsequent COMP-3 field are coming into that field.
Are you using the same copybook metadata for both PX and Server ?
Yes, I am using the same copybook and the same source file...
RK
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

Odd - do you have one import of the copybook in your DataStage metadata, and are you inserting that definition into both jobs?
g_rkrish
Participant
Posts: 264
Joined: Wed Feb 08, 2006 12:06 am

Post by g_rkrish »

ArndW wrote:Odd - do you have one import of the copybook in your DataStage metadata, and are you inserting that definition into both jobs? ...
No, I deleted the server job and tried the same definition in a parallel job...
RK
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

Try converting all of your COMP fields to correctly sized PIC X columns and read the data; see if the contents of COMPANY and DIVISION remain correct - in that case you have the correct record length. Then change one COMP-3 at a time and see if the conversions work.
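(If it helps, the same boundary check can be done outside DataStage with a few lines of Python; the file name here is hypothetical, and the record length comes from the copybook tally earlier in the thread.)

RECLEN = 166  # from the copybook; use 166 + 2 if records are CR/LF terminated

with open("arfslsmn.dat", "rb") as f:   # hypothetical file name
    rec = f.read(RECLEN)

# Hex-dump the key fields: COMPANY should be readable EBCDIC text, and the
# COMP-3 fields should end in a C/D/F sign nibble if the offsets are right.
for name, length in [("ARS-COMPANY", 3), ("ARS-DIVISION", 3),
                     ("ARS-SLSMN-NO", 3), ("ARS-CATEGORY-CLASS", 4)]:
    field, rec = rec[:length], rec[length:]
    print(f"{name:20s} {field.hex()}")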
g_rkrish
Participant
Posts: 264
Joined: Wed Feb 08, 2006 12:06 am

Post by g_rkrish »

ArndW wrote:Try converting all of your COMP fields to correctly sized PIC X columns and read the data; see if the contents of COMPANY and DIVISION remain correct - in that case you have the correct record length. Then chan ...
It is still showing the same garbage...
RK
shamshad
Premium Member
Posts: 147
Joined: Wed Aug 25, 2004 1:39 pm
Location: Detroit,MI

Post by shamshad »

OK, I will send you a word document or paste the content here tomorrow morning :)
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

If the content is still the same garbage then you have a record length issue you need to fix first.
shamshad
Premium Member
Posts: 147
Joined: Wed Aug 25, 2004 1:39 pm
Location: Detroit,MI

Post by shamshad »

First, take the COBOL FILE DEFINITION and load it into the repository metadata; this will convert the PIC X and PIC 9 COMP fields to ASCII types.

One very important thing to check after importing the CFD into metadata is the conversion. For example, for a PIC S9(9) USAGE COMP-1 field, DataStage converts it to Integer(9), whereas for a SERVER job this has to be manually changed to Integer(4). Check the conversion for the other COMP variants (COMP-2, COMP-3) based on how many bytes they should actually occupy.

(1)

NLS MAP at SOURCE SHOULD BE "NONE"

Design a SERVER job and load the source columns from the imported CFD definition. Once you load this, all the COMP fields will automatically be converted to the proper integer types. If you are pulling the mainframe file through the FTP plug-in, make sure you have these settings:

Data Rep : Binary
Check data against : YES
Line Term. : No Termination
Fixed width column : Yes

REST AS DEFAULTS

Drag in a transformer and a text file as the target (just for testing). For each source column, use the proper SERVER function to convert the column from binary to ASCII. For example:

for PIC X columns Use DataTypeEbcdicToAscii()
for COMP1 columns Use DataTypePicComp()

Dump the columns to the text file. If you are able to see the values in the target file, that means the source file and the byte length of each column are defined properly.
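(As a side note, what DataTypeEbcdicToAscii() does for the PIC X columns can be sanity-checked in plain Python with the cp037 codec; cp037 is US EBCDIC, which is an assumption - substitute your shop's code page.)

raw = bytes.fromhex("c1d9e2")   # EBCDIC bytes for "ARS"
print(raw.decode("cp037"))      # -> ARS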

(2) PARALLEL JOB:

FTP Enterprise plug-in > Transformer > Text file

In the source, load the CFD definition that was imported. In the transformer, use NO functions to convert the data.

Transfer Type : Binary
Record Length : Fixed (Record Level)
Delimiters : None (Field default)

In the target text file's FORMAT tab, select "EXPORT EBCDIC as ASCII".
Test with Decimal PACKED=YES; if that doesn't work, remove the option.

Once you run this job and VIEW the data through DataStage, you should be able to see the converted values. If you do, design a second job where:
TEXT FILE (the file from the first step) > Transformer > Table
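(To cross-check what the job should produce, one record can also be decoded end to end in Python, reusing the unpack_comp3 sketch and the 166-byte offsets worked out earlier in the thread; the file name and code page are assumptions.)

with open("arfslsmn.dat", "rb") as f:            # hypothetical file name
    rec = f.read(166)
company  = rec[0:3].decode("cp037")              # PIC X(3)
division = unpack_comp3(rec[3:6])                # PIC 9(4) COMP-3
name     = rec[13:43].decode("cp037").rstrip()   # PIC X(30)
draw     = unpack_comp3(rec[88:94], scale=2)     # ARS-CUMULATIVE-DRAW
print(company, division, name, draw)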


Good Luck !
g_rkrish
Participant
Posts: 264
Joined: Wed Feb 08, 2006 12:06 am

Post by g_rkrish »

shamshad wrote:First, take the COBOL FILE DEFINITION and load it into the repository metadata; this will convert the PIC X and PIC 9 COMP fields to ASCII types. ...
Thanks for your support. Finally I made it work...
RK