ERROR Reading source csv file

h4harry1
Participant
Posts: 16
Joined: Sat Mar 19, 2011 8:01 am

ERROR Reading source csv file

Post by h4harry1 »

I'm trying to read a huge CSV file and got the following error. Any help on how to fix this?

Please suggest what settings I need to change.

##E IIS-DSEE-TOIX-00158 23:00:10(000) <Sequential_File_Read,0> Error reading on import.
##E IIS-DSEE-TFRS-00061 23:00:10(001) <Sequential_File_Read,0> Consumed more than 100,000 bytes looking for record delimiter; aborting
##E IIS-DSEE-TOIX-00179 23:00:10(002) <Sequential_File_Read,0> Import error at record 0.
##E IIS-DSEE-TFOR-00089 23:00:10(003) <Sequential_File_Read,0> The runLocally() of the operator failed.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

All we can tell from that is that the record delimiter you used is wrong, as in it doesn't match what's in the source file.
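
If you can get at the file from the command line, a quick check of which line ending it actually contains will settle it. Here's a minimal Python sketch for that check; "source.csv" is just a placeholder for your real file path:

# Rough check of which record delimiter the file actually uses.
# "source.csv" is a placeholder -- substitute your actual file path.
with open("source.csv", "rb") as f:
    sample = f.read(1_000_000)  # the first 1 MB is plenty for this check

crlf = sample.count(b"\r\n")      # DOS/Windows line endings
lf = sample.count(b"\n") - crlf   # bare UNIX newlines
cr = sample.count(b"\r") - crlf   # bare carriage returns

print(f"CRLF: {crlf}  bare LF: {lf}  bare CR: {cr}")

Whichever count dominates is the delimiter your format settings need to declare.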
-craig

"You can never have too many knives" -- Logan Nine Fingers
h4harry1
Participant
Posts: 16
Joined: Sat Mar 19, 2011 8:01 am

Post by h4harry1 »

My source file is very large, containing nearly 50,000 records with around 100 columns. If I give only 500 records in the source file, the job runs fine, so something is causing a problem when I use the complete file.

Any help?
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Same answer, but it could be specific to a portion of the file, as in the file is corrupted. Or perhaps whatever you're doing to create the smaller file adds the appropriate delimiter. There's no way for us to know.

We can't see your file and you haven't shared any of the settings you are using, so not much anyone can do but guess.
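
That said, if you want to locate the break yourself, you could scan for the first "record" that runs past the 100,000-byte limit the error message mentions. A rough Python sketch, assuming a UNIX newline is the expected delimiter and with "source.csv" again standing in for your real path:

# Find the first stretch of the file longer than 100,000 bytes with no
# record delimiter -- the likely cause of the abort.
# Assumptions: the expected delimiter is a UNIX newline (b"\n") and
# "source.csv" is a placeholder path.
LIMIT = 100_000

offset = 0
with open("source.csv", "rb") as f:
    for recno, line in enumerate(f, start=1):  # binary iteration splits on b"\n"
        if len(line) > LIMIT:
            print(f"record {recno} starting at byte {offset} runs "
                  f"{len(line):,} bytes without a newline")
            break
        offset += len(line)
    else:
        print("every record fits; the delimiter setting itself is suspect")

If it flags record 1, the delimiter in your job simply doesn't match the file at all (your "Import error at record 0" suggests exactly that); if it flags something deep in the file, you've found your corrupt section.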
-craig

"You can never have too many knives" -- Logan Nine Fingers