Hi,
I am running a job that reads data from a sequential file and, after processing, loads a dataset. The job runs to completion, but it throws the following warning:
InputSfSrc,0: Import consumed only 505 bytes of the record's 506 bytes (no further warnings will be generated from this partition)
InputSfSrc,0: Import warning at record 0.
All the input rows are being read and transferred to the target.
Has anyone come across this issue before? Any idea why this is happening and how it can be handled?
Thanks in advance.
Regards,
The Bird.
Warning while reading from Sequential File.
Hi,
Have you got a 505-character limit on the row data?
Do you read the row as a single column?
Can you verify that the 506th character gets lost in the process?
Could it be a row delimiter issue?
IHTH,
Roy R.
Time is money but when you don't have money time is all you can afford.
Search before posting:)
Join the DataStagers team effort at:
http://www.worldcommunitygrid.org
![Image](http://www.worldcommunitygrid.org/images/logo.gif)
Hi Roy,
Got the issue resolved. It was because I had set the Record Delimiter to a UNIX newline, while the source file was not a UNIX file.
My 505th character is a quote, and that is where the record ends.
Thanks for the quick response.
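For anyone hitting the same warning: the one-byte gap comes from the DOS carriage return. A DOS-format file ends each record with CR+LF, so if the stage's delimiter is a UNIX newline (LF only), the CR is left unconsumed. This is just a quick sketch outside DataStage (the byte string is made up) showing the arithmetic:

```python
# Sketch (not DataStage): why a DOS-format record read with a UNIX
# newline ("\n") delimiter leaves exactly one unconsumed byte.
data = b"some,record,data\r\n"   # hypothetical DOS record: ends in CR+LF

# Splitting on LF alone consumes only the "\n"; the "\r" stays in the record.
record = data.split(b"\n")[0]

print(len(data) - 2, len(record))  # payload is 16 bytes, but 17 were "imported"
print(record[-1:])                 # the leftover byte is the carriage return
```

Converting the file with dos2unix (or setting the Record Delimiter to DOS-style) removes the discrepancy.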
Regards,
The Bird.