
Complex Flat Files adding extra row

Posted: Thu Sep 27, 2007 7:23 am
by JezT
I have a process that reads an ASCII file in a Complex Flat File stage and passes it through a transformer and loads into a DB2 Table.

Within the CFF stage, the Record Style is set to CR/LF, and this causes an extra row of spaces to be added at the end of the file. When the process runs, it errors on this last row, stating "The syntax of the string representation of a datetime value is incorrect."

It appears to error only when the fields involved are non-character fields in the source and target: if the fields are Dates, Integers or Decimals, this error is present.

Is there a way of removing this blank row before it reaches the Transformer stage? Or is there a way within the CFF stage to tell DataStage to ignore the final row of the file?

Cheers
JezT

Posted: Thu Sep 27, 2007 4:06 pm
by ray.wurlod
Check whether there's an extra line terminator at the end of the file. If so, pre-process the file (for example with sed) to remove it. Examples have been posted in the past, for which you can search.
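One such pre-processing sketch, assuming GNU sed and placeholder file names, deletes the last line only if it is empty or all whitespace (a CR before the LF also counts as whitespace, so a CR/LF-terminated blank row is caught too):

```shell
# On the last line only ($), delete it if it contains nothing but
# whitespace. input.dat / output.dat are placeholder names.
sed -e '${/^[[:space:]]*$/d}' input.dat > output.dat
```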

Posted: Thu Sep 27, 2007 6:08 pm
by flashgordon
Yes,

Don't be afraid of your operating system. If it's Unix, go in with view (read-only vi) and you should be able to see what's going on at the point in the file where you are having the problem.

In terms of chucking blank lines, maybe a preprocessing step. If you are using CR/LF, these are ASCII files, so you could preprocess with a Sequential File stage, chuck the bad line(s), and then give the file to the CFF stage. The CFF stage would have a hard time throwing a bad line out: your declaration expects a fixed number of bytes, and suddenly it gets a row that's very short and doesn't have the columns it expects. I don't think CFF will throw it out.
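A sketch of that preprocessing idea outside DataStage, assuming a fixed record width of 80 bytes (the width and the file names are placeholders you would adjust to your copybook layout):

```shell
# Keep only lines that match the expected fixed record width (80 here).
# A trailing blank or short row is dropped before the CFF stage sees it.
awk 'length($0) == 80' input.dat > clean.dat
```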

... Flash

Posted: Fri Nov 02, 2007 7:50 pm
by flashgordon
Hi,

I want to add something important to this thread. This is embarrassing, because I told this person they probably had a blank line in their file. PeopleSoft Support has told me that it is a known problem in 7.5.1.x: the CFF stage adds a bogus blank row to the end of an input file. We were working with the CFF stage reading ASCII files with a COBOL copybook; I don't know whether it has the same problem reading an EBCDIC file. But the person reporting this problem wasn't smoking something: this is a bug. PeopleSoft Support didn't say whether it is fixed in Hawk.

... Tom

Posted: Tue Dec 11, 2007 4:02 pm
by flashgordon
Hi,

If the CFF stage is adding an extra row with null values, I found a way (below) to remove it.

This is what I found worked. Remember that nulls in a fixed-column input file are different from nulls in delimited data: the field still has its full width, even if it is filled with binary zeroes. The trick that keeps that last null row out of Oracle is shown below:

1) In the CFF stage, do NOT check the Preserve Nulls box.

2) Then put the line below in the constraint on the link that goes to your output PS_NW_... Output Table:

IncomingFile.GL_HYP_DBAL_ACCT <> char(0):char(0):char(0):char(0):char(0)

(For your own problem you have to concatenate as many char(0) values as your column is wide.)

This shows n rows coming in from the CFF stage and n-1 rows going to Oracle.

To make the constraint line above easier to understand, this is what it is doing:

IncomingFile.GL_HYP_DBAL_ACCT <> x'0000000000' # the input column is compared to a 5-byte field filled with binary zeroes ("x" stands for hex)
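If you want to confirm those binary zeroes are really in your file, a hex dump of the file will show them; the file name here is a placeholder:

```shell
# Dump the file as hex bytes; a bogus null row shows up as a run of 00s.
od -An -tx1 input.dat | tail -5
```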