Hi,
In production, the job aborts with these fatal errors:
1. Consumed more than 100000 bytes looking for record delimiter; aborting (the end delimiter is missing in the Nth record)
2. Import error at record 1892716.
We are reading data from around 15,000 files, and the import failed at record 1892716 because of the missing delimiter.
How can I find which file contains this record?
The Nth-record import fails because the delimiter is missing in one of the files.
Are you using a file pattern to select the files to read? Try setting $APT_IMPORT_PATTERN_USES_FILESET=1. This has DataStage read the files individually as a fileset instead of as a single cat'ed collection.
You can also perform some straightforward file validation prior to the DataStage processing.
How are the files delimited? With a DOS/Windows line terminator? If the SeqFile stage is looking for that terminator and a file was accidentally created with Unix terminators, you would get this error. This happens a lot when moving files between different O/S platforms.
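As a sketch of that pre-validation idea: the script below scans a directory and flags files containing carriage returns (DOS/Windows terminators) and files with a record longer than the 100000-byte limit from the error message. The function name `validate_files` and the directory layout are illustrative, not part of any DataStage tooling; adapt the pattern and threshold to your landing area.

```shell
#!/usr/bin/env bash
# validate_files DIR
# Reports files that contain CRLF (DOS) line terminators, and files whose
# longest record exceeds the importer's 100000-byte delimiter search limit.
validate_files() {
    local dir="$1" f crlf maxlen
    for f in "$dir"/*; do
        [ -f "$f" ] || continue
        # Count lines containing a carriage return (CRLF terminators)
        crlf=$(grep -c $'\r' "$f") || true
        [ "$crlf" -gt 0 ] && echo "DOS terminators: $f ($crlf lines)"
        # Track the longest record in the file
        maxlen=$(awk '{ if (length($0) > m) m = length($0) } END { print m + 0 }' "$f")
        [ "$maxlen" -gt 100000 ] && echo "over-long record: $f (max $maxlen bytes)"
    done
    return 0
}
```

Running this over the input directory before the job starts narrows 15,000 files down to the handful worth inspecting; any file flagged for an over-long record is a likely source of the "consumed more than 100000 bytes" abort.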
Regards,
- james wiles
All generalizations are false, including this one - Mark Twain.