
Input buffer overrun at field

Posted: Mon Jul 29, 2013 5:15 am
by nitingupta
Hi,

My job flow is as below:
CFF -> Transformer -> Dataset

Source: CFF stage

Output Warnings:
1. CFF_TSE_PCID,0: Import consumed only 196 bytes of the record's 199 bytes (no further warnings will be generated from this partition)

2. CFF_TSE_PCID,0: Input buffer overrun at field "cff_tse_pcid_record_type.DETAIL.FORMAT_CD", at offset: 19

Column: cff_tse_pcid_record_type.DETAIL.FORMAT_CD is INT, defined as Binary[4]

3. Tag value: PDD does not match any tagcase for tagged field "cff_tse_pcid_record_type"


Rows are getting rejected with the above errors.
Please suggest how to resolve them.

Posted: Mon Jul 29, 2013 6:27 am
by chulett
Those are all about your metadata in the CFF stage not matching the file you are processing.

1. Your metadata shows a record length of 199 but the record ended at 196.
2. The field mentioned was larger than your metadata allowed.
3. You are processing multiple record types and 'PDD' is not a type you have accounted for.
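To picture how a single field-width mismatch can produce all three symptoms, here is a minimal Python sketch (the layout, values, and tag are invented for illustration, not taken from the actual copybook): a field that is really a 2-byte PIC S9(4) COMP halfword, but declared in the job metadata as 4 bytes, shifts every byte that follows it, so the record appears to "end early" and the record-type tag is read from the wrong position.

```python
import struct

# Hypothetical record: 19-byte key, a 2-byte COMP halfword (PIC S9(4) COMP),
# then a 3-byte record-type tag -- values invented for illustration.
record = b"A" * 19 + struct.pack(">h", 123) + b"PDD"

# Metadata declaring the field as Binary[4] swallows 2 extra bytes
# (the "buffer overrun at offset 19") and shifts the tag bytes.
wrong_val, = struct.unpack_from(">i", record, 19)
wrong_tag = record[23:26]          # only 1 byte left -> truncated tag

# Metadata matching the file: 2-byte halfword, tag lines up correctly.
right_val, = struct.unpack_from(">h", record, 19)
right_tag = record[21:24]

print(right_val, right_tag)   # 123 b'PDD'
print(wrong_val, wrong_tag)   # garbage value, mangled tag
```

The same shift explains why the tag value no longer matches any tagcase: the parser is comparing bytes that belong to the next field.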

Posted: Mon Jul 29, 2013 6:47 am
by arunkumarmm
How did you define your metadata in CFF? If you have defined it manually, try importing the actual copybook.

Posted: Mon Jul 29, 2013 6:54 am
by nitingupta
I am using the actual copybook, but I am still not able to find the exact reason why I am getting all these errors.

Posted: Mon Jul 29, 2013 7:12 am
by arunkumarmm
Do you have any packed fields? If so, how have you defined them?

And are you trying to read more than one record type out of the same CFF stage? If so, did you try deleting the other links and viewing the data?

What is the NLS map you have defined?

Posted: Mon Jul 29, 2013 10:40 am
by nitingupta
I have defined FORMAT_CD as BINARY[4] USAGE COMP.
In the output it is INT 4; the rest of the columns are CHAR. I am not aware of what the NLS map should be.

Posted: Mon Jul 29, 2013 10:46 am
by nitingupta
I am able to view the data, but out of 924 records, 24 are getting rejected due to the above warnings; the remaining 900 records go through without any warning.

Posted: Mon Jul 29, 2013 10:53 am
by arunkumarmm
nitingupta wrote: I have defined FORMAT_CD as BINARY[4] USAGE COMP. In the output it is INT 4; the rest of the columns are CHAR. I am not aware of what the NLS map should be.
So that column in the copybook is defined as PIC 9(4) COMP?

As for NLS: if your source file is in EBCDIC format, you should set it to an EBCDIC map so that it gets converted. Remember, a COMP field with the above specification will occupy only two bytes.
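The two-byte point follows the usual COBOL COMP sizing rule. As a rough illustration (the digit-to-byte mapping below reflects common mainframe compiler conventions):

```python
def comp_bytes(digits: int) -> int:
    """Storage for a COBOL COMP (binary) field, per common compiler rules."""
    if digits <= 4:
        return 2   # halfword
    if digits <= 9:
        return 4   # fullword
    return 8       # doubleword

print(comp_bytes(4))  # 2 -> PIC S9(4) COMP occupies two bytes, not four
```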

Posted: Mon Jul 29, 2013 11:04 am
by nitingupta
I am reading the file as ASCII, and yes, it is PIC S9(4).

Posted: Mon Jul 29, 2013 11:18 am
by arunkumarmm
PIC S9(4) is not the same as PIC 9(4) COMP. And there it is: it should not be a binary field in an ASCII file. Is there a reason why you are trying to read an ASCII file using CFF, and not a Sequential File stage?
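To make the distinction concrete (an illustrative sketch, not DataStage internals): a PIC S9(4) field in DISPLAY usage is stored as four character digits, while PIC 9(4) COMP is a 2-byte binary halfword. In an ASCII text file you would expect the former, not the latter.

```python
import struct

value = 1234
display = str(value).encode("ascii")   # b'1234' - 4 readable digit bytes
comp = struct.pack(">h", value)        # b'\x04\xd2' - 2 raw binary bytes

print(display, comp)
```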

Posted: Mon Jul 29, 2013 11:19 pm
by nitingupta
I had used the following:
byte order: Native Endian
data format: Binary
character set: ASCII
column: FORMAT_CD BINARY[4] USAGE COMP, PIC S9(4) COMP

I also tried with NLS map = ASCL_EBCDIC, but it is not working.

Posted: Mon Jul 29, 2013 11:27 pm
by arunkumarmm
You did not answer the important question. Is your file ASCII or EBCDIC?

Posted: Mon Jul 29, 2013 11:42 pm
by nitingupta
It is a file coming from a mainframe job after SFTP, and it is ASCII; I FTPed it using normal FTP.
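The transfer mode matters here: text-mode FTP translates every byte from EBCDIC to ASCII, which is fine for character data but silently corrupts any COMP (binary) fields in the record. A small Python sketch of the effect (cp037 is just one common EBCDIC code page, picked for illustration):

```python
import struct

# A 2-byte COMP value as it would sit in the EBCDIC source file.
raw = struct.pack(">h", 500)                       # b'\x01\xf4'

# Text-mode transfer remaps the whole record byte by byte (EBCDIC -> ASCII);
# binary fields get translated along with the text and lose their value.
translated = raw.decode("cp037").encode("latin-1")

print(struct.unpack(">h", raw)[0])         # 500
print(struct.unpack(">h", translated)[0])  # no longer 500
```

Transferring the file in binary mode preserves COMP fields; a text-mode transfer only works for purely character data.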

Posted: Mon Jul 29, 2013 11:45 pm
by arunkumarmm
So if the file is ASCII, why are you using CFF?