Warning - Import consumed only n bytes of record

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

raji33
Premium Member
Posts: 151
Joined: Thu Sep 23, 2010 9:21 pm
Location: NJ

Warning - Import consumed only n bytes of record

Post by raji33 »

Hi,

I am getting the warning below. Any suggestions?

Import consumed only 205 bytes of the record's 208 bytes (no further warnings will be generated from this partition)

Thanks
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

First suggestion would be to do an exact search for "Import consumed only" to find other discussions on this issue. See if any of them help.
-craig

"You can never have too many knives" -- Logan Nine Fingers
raji33
Premium Member
Posts: 151
Joined: Thu Sep 23, 2010 9:21 pm
Location: NJ

Post by raji33 »

Hi chulett,

I have tried the suggestions I found, but I still see the warning. I see both of these warnings together and have no idea what the problem is. Is there a way we can check record 248719?

Import consumed only 205 bytes of the record's 208 bytes (no further warnings will be generated from this partition)
Import warning at record 248719.
asorrell
Posts: 1707
Joined: Fri Apr 04, 2003 2:00 pm
Location: Colleyville, Texas

Post by asorrell »

There's almost always a way... but we don't know a whole lot about what you are trying to do. You didn't even say what data source you are reading from.

Is it a sequential file? A database table? Does the same record number always bounce or does it change? Do you have NLS characters in your data source? Are you using standard or NLS definitions for your columns?

Put the effort in to describe the problem thoroughly and you'll have a better chance of someone knowing what's wrong.
Andy Sorrell
Certified DataStage Consultant
IBM Analytics Champion 2009 - 2020
raji33
Premium Member
Posts: 151
Joined: Thu Sep 23, 2010 9:21 pm
Location: NJ

Post by raji33 »

The source is a sequential file. Every time I run the job it shows a different record number. There are no NLS characters in the source. Basically my job funnels 4 sequential files into a single output sequential file, as below.

Code: Select all

seq-----\
seq-----\
seq-----\funnel---------trans---------seq
seq-----/
asorrell
Posts: 1707
Joined: Fri Apr 04, 2003 2:00 pm
Location: Colleyville, Texas

Post by asorrell »

Unless your source files are changing, the import record number from the sequential file shouldn't change from run to run. The error message will come from a particular sequential stage so it should be easy to identify which file has the problem.

Then you can use UNIX utilities (head, tail, vi, cat) to identify the bad line and determine what is wrong with it. It is most probably an NLS character or some control characters that are invisible in the file. You'd have to use vi's "set list" or cat -Tv (or cat -tv depending on system) to show the invisible characters.
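For example, something along these lines (just a sketch, assuming a plain-text file at a made-up path and that the record number in the log corresponds to a line number in the file) will dump the suspect line with tabs and non-printing characters made visible:

Code: Select all

# show line 248719 with tabs and control characters made visible
# (use cat -tv instead of cat -Tv depending on your system)
sed -n '248719p' /path/to/source_file.dat | cat -Tv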

If you have difficulty with that, set up a test job that reads only one file. Have it read every record into a single varchar field that is big enough to hold the entire record (including larger ones). Then use a Transformer to test the length of the field and send "bad" records with an incorrect length to a separate file so you can look at them.
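Outside of DataStage you can get a similar split with a quick one-liner (a sketch only, assuming the expected record length of 208 bytes and a made-up file name) that writes any record of the wrong length to a separate file:

Code: Select all

# records whose length is not exactly 208 go to bad_records.txt, prefixed with their line number
awk 'length($0) != 208 { print NR ": " $0 > "bad_records.txt" }' /path/to/source_file.dat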

If the import message is coming from the Funnel stage, re-post here; that's a different issue.
Andy Sorrell
Certified DataStage Consultant
IBM Analytics Champion 2009 - 2020
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Typically you see that message on a fixed-width file; is that what you have? The short answer (which the search should have turned up) is that your metadata doesn't match the actual data in the file: specifically, it is expecting all of your records to be 208 bytes long and it found one that ended after 205 bytes.

Other than what Andy posted, something I've seen in the past is problems with 'optional' fields at the end of fixed-width records when the files are ftp'd onto the ETL server. Any chance your last field is three bytes and can (on occasion) be spaces? FTP is known to strip those "extra" spaces from the end at times. Perhaps something like that is happening.
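If stripped trailing spaces do turn out to be the cause, one possible workaround (again just a sketch, assuming a 208-byte fixed width and a made-up file name) is to re-pad short lines before the job reads the file:

Code: Select all

# pad every line back out to 208 characters with trailing spaces (longer lines are left unchanged)
awk '{ printf "%-208s\n", $0 }' /path/to/source_file.dat > /path/to/source_file_padded.dat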
-craig

"You can never have too many knives" -- Logan Nine Fingers
raji33
Premium Member
Posts: 151
Joined: Thu Sep 23, 2010 9:21 pm
Location: NJ

Post by raji33 »

Thanks, all. Yes, there was bad data from the source that did not match the metadata.