I have a file with 3 million records. I know this is not a very huge file for an ETL tool to handle. Even though I use a very simple Sequential File stage to Sequential File stage conversion, I hit an error and the job aborted.
So I thought this may be a file-size restriction in the operating system, and I tried a FileSet instead (which, as we all know, subdivides the file into chunks of at most 2 GB each).
Now the job runs and I can see the data passing through, but it still aborts.
I got the following errors:
APT_CombinedOperatorController,1: Unsupported read in APT_FileBufferOutput::spillToNextFile(): Bad file number.
APT_CombinedOperatorController,0: Unsupported read in APT_FileBufferOutput::spillToNextFile(): Bad file number.
CIFWORK_CPC,1: Export failed.
CIFWORK_CPC,1: Output file full, and no more output files
CIFWORK_CPC,0: Export failed.
CIFWORK_CPC,0: Output file full, and no more output files
CIFWORK_CPC,1: The runLocally() of the operator failed.
APT_CombinedOperatorController,1: The runLocally() of the operator failed.
APT_CombinedOperatorController,1: Operator terminated abnormally: runLocally did not return APT_StatusOk
CIFWORK_CPC,0: The runLocally() of the operator failed.
CIFWORK_CPC,0: Output 0 produced 2 records.
CIFWORK_CPC,0: Export complete; 0 records exported successfully, 0 rejected.
APT_CombinedOperatorController,0: The runLocally() of the operator failed.
APT_CombinedOperatorController,0: Operator terminated abnormally: runLocally did not return APT_StatusOk
main_program: Step execution finished with status = FAILED.
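In case it helps, here is how I checked whether the operating system (rather than DataStage itself) is imposing a file-size or disk-space limit on the account running the job. This is just a sketch of generic Unix checks; the `/tmp` path is an assumption, so substitute the scratch/resource disk paths from your own APT configuration file:

```shell
# Per-process file size limit for the current user ("unlimited" or a
# number of 512-byte blocks). DataStage jobs inherit this limit from
# the environment of the engine, so a low value here can cause
# "Output file full" aborts even when the disk itself has space.
ulimit -f

# Free space on the scratch/resource disk the job writes to.
# NOTE: /tmp is a placeholder -- check the disk pools named in your
# APT config file instead.
df -k /tmp
```

If `ulimit -f` reports a value around 4194304 (2 GB in 512-byte blocks), that would match the 2 GB symptom even with a FileSet, since each segment file still has to be written by the same limited process.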
Thanks in advance
-kumar