Warning while reading from Sequential File.

Post questions here relating to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.


thebird
Participant
Posts: 254
Joined: Thu Jan 06, 2005 12:11 am
Location: India

Warning while reading from Sequential File.

Post by thebird »

Hi,

I am running a job which reads data from a sequential file and, after processing, loads it into a dataset. The job runs to completion, but throws a warning which says:

InputSfSrc,0: Import consumed only 505 bytes of the record's 506 bytes (no further warnings will be generated from this partition)

InputSfSrc,0: Import warning at record 0.

All the rows from the input are being read and transferred to the target.

Has anyone come across this issue before? Any idea why this is happening and how it can be handled?

Thanks in advance.

Regards,

The Bird.
roy
Participant
Posts: 2598
Joined: Wed Jul 30, 2003 2:05 am
Location: Israel

Post by roy »

Hi,
Have you got a 505-character limit for the row data?
Do you read the row as a single column?
Can you verify that the 506th character gets lost in the process?
Could it be a row delimiter issue?
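
One quick way to check most of these outside DataStage is to scan the raw file's record lengths and trailing bytes. A minimal Python sketch, assuming a hypothetical file path and the 506-byte record length implied by the warning:

from collections import Counter

SRC = "/tmp/source_file.txt"   # hypothetical path to the source sequential file

lengths = Counter()
tails = Counter()
with open(SRC, "rb") as f:     # read raw bytes so the delimiters stay visible
    for rec in f:              # iteration splits on \n; a CRLF file leaves \r on each record
        lengths[len(rec)] += 1
        tails[rec[-2:]] += 1

print("record lengths seen:", dict(lengths))
print("last two bytes of each record:", dict(tails))
if any(k.endswith(b"\r\n") for k in tails):
    print("records end in CRLF - the file looks DOS-format, not UNIX")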

IHTH,
Roy R.
Time is money but when you don't have money time is all you can afford.

Search before posting:)

Join the DataStagers team effort at:
http://www.worldcommunitygrid.org
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

thebird,

If this warning is only issued once, it might be that the last line in the file contains neither a record nor a file delimiter.
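
A quick way to test for that case is to look at the last couple of bytes of the file. A small Python sketch, assuming a hypothetical path and a file at least two bytes long:

SRC = "/tmp/source_file.txt"   # hypothetical path to the source sequential file

with open(SRC, "rb") as f:
    f.seek(-2, 2)              # position two bytes before the end of the file
    tail = f.read()

if tail.endswith(b"\r\n"):
    print("last record ends with a DOS delimiter (CRLF)")
elif tail.endswith(b"\n"):
    print("last record ends with a UNIX delimiter (LF)")
else:
    print("last record has no newline - the final delimiter is missing")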
thebird
Participant
Posts: 254
Joined: Thu Jan 06, 2005 12:11 am
Location: India

Post by thebird »

Hi Roy,

Got the issue resolved. It was because I had set the Record Delimiter to a UNIX newline, while the source file was not a UNIX file.

My 505th character is a quote and that is where the record ends.
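
For anyone hitting the same warning, one way to confirm the file's format before changing the Record Delimiter is to count the two kinds of line endings. A minimal Python sketch, assuming a hypothetical path:

SRC = "/tmp/source_file.txt"   # hypothetical path to the source sequential file

with open(SRC, "rb") as f:
    data = f.read()

crlf = data.count(b"\r\n")
lf_only = data.count(b"\n") - crlf
print(f"CRLF (DOS) delimiters: {crlf}")
print(f"bare LF (UNIX) delimiters: {lf_only}")
# If CRLF dominates, either adjust the stage's record delimiter to match
# or convert the file (for example with dos2unix) before the job reads it.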

Thanks for the quick response.

Regards,

The Bird.
roy wrote: Hi,
Have you got a 505-character limit for the row data?
Do you read the row as a single column?
Can you verify that the 506th character gets lost in the process?
Could it be a row delimiter issue?

IHTH,