file failure

Posted: Mon Apr 20, 2009 7:09 am
by deesh
Hi Friends,

I am facing the failure below.

SEQ_SAP_FILE,0: Consumed more than 100000 bytes looking for record delimiter; aborting

Posted: Mon Apr 20, 2009 7:14 am
by ShaneMuir
Look at the delimiter on the file. The message is saying that it can't find it.

Posted: Mon Apr 20, 2009 7:19 am
by chulett
Your metadata is incorrect and it's saying it can't find the record delimiter you said would be there. As noted, correct that.

Posted: Mon Apr 20, 2009 8:01 am
by deesh
chulett wrote:Your metadata is incorrect and it's saying it can't find the record delimiter you said would be there. As noted, correct that. ...

I have created the file in the previous job without using any record delimiter. I am running the jobs on windows 2003 server.

This job was running fine previously. Is it possible that somebody changed some Admin parameter, and that is why the job is now giving this error?

Please help.

Posted: Mon Apr 20, 2009 8:43 am
by chulett
Nope. So it's a fixed-width file? Or did you mean you used the default delimiter in the creation job? Regardless, you need to match the definitions in the stage to the actual file, nothing more or less will work.
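To see why a mismatch between the stage definition and the actual file produces this exact message, here is a rough sketch in plain Python (not DataStage code; the function name and logic are illustrative only) of what a reader does when it scans for a record delimiter that never appears, e.g. because the file was written fixed-width with no delimiter at all:

```python
# Illustrative sketch only -- not DataStage internals.
MAX_SCAN = 100_000  # the limit mentioned in the error message

def read_record(data: bytes, delimiter: bytes = b"\n") -> bytes:
    """Scan for the delimiter; give up if too many bytes go by without one."""
    idx = data.find(delimiter, 0, MAX_SCAN + len(delimiter))
    if idx == -1:
        raise RuntimeError(
            f"Consumed more than {MAX_SCAN} bytes "
            "looking for record delimiter; aborting"
        )
    return data[:idx]

# A file that actually contains the delimiter reads fine:
print(read_record(b"8787|9870\n233|488\n"))   # first record, up to the newline

# A file written with no record delimiter (e.g. fixed-width) never yields one,
# so the reader hits the scan limit and aborts:
try:
    read_record(b"x" * 200_000)
except RuntimeError as e:
    print(e)
```

The point is that the reader does not guess: if the stage says a delimiter is there, it keeps scanning until the byte limit, then aborts with this message.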

Posted: Mon Apr 20, 2009 11:44 am
by dsuser_cai
When you create a file, you deliberately specify a delimiter (maybe a pipe symbol, or some other character) and use the same delimiter in the job that reads the file. Sometimes the source data itself contains the delimiter character you chose, and this can cause problems.
For example if the source data has a column called COL1 as

COL1
8787,9870
233,488
823723,089873

In the above case, if you use a comma as the delimiter, DS will push the data after the comma into the next column, treating it as a new field. So always specify a separate (standard) delimiter that does not occur in the data. Try this and let us know if you need further assistance.
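The field-splitting problem above can be sketched in plain Python (again, not DataStage; just an illustration of the splitting behaviour) using the sample COL1 values:

```python
# COL1 values that happen to contain commas:
rows = ["8787,9870", "233,488", "823723,089873"]

# If comma is the field delimiter, each value is split into two fields,
# so data after the comma spills into the next column:
print([r.split(",") for r in rows])
# -> [['8787', '9870'], ['233', '488'], ['823723', '089873']]

# Writing with a delimiter that never occurs in the data (e.g. a pipe)
# keeps each COL1 value intact on the way back in:
line = "|".join(rows)
print(line.split("|"))
# -> ['8787,9870', '233,488', '823723,089873']
```

This is why picking a delimiter that cannot appear in the data matters as much as matching the delimiter between the writing and reading jobs.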