I have a job which reads from a sequential file and writes to a fileset.
This job runs fine when my input sequential file has 38 million records (3 GB in size).
However, when the input file grows to 140 million records (11 GB), the job fails with the following error:
File_Set_11,0: Bad read in APT_FileBufferOutput::spillToNextFile(): Bad file numberI
Any inputs for resolving this would be appreciated.
Thanks,
Help Needed: Writing to Fileset gives error
Hi,
Check the maximum permissible file size on your system. Since you are on Unix, run the `ulimit -a` command. If the maximum permissible file size is smaller than the file you need to write, increase the limit.
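A quick sketch of that check from the shell (note: a numeric `ulimit -f` value is reported in 512-byte blocks on most shells, so compare accordingly):

```shell
#!/bin/sh
# Sketch: check the per-process file size limit before writing a large fileset.
# "unlimited" means the OS will not truncate large files.
limit=$(ulimit -f)
echo "file size limit: $limit"
if [ "$limit" = "unlimited" ]; then
    echo "no per-process file size cap"
else
    echo "capped at $limit blocks; raise it with: ulimit -f unlimited"
fi
```

Run this as the same user that the DataStage engine runs under, since the limit is per-user/per-process, not system-wide.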
If the file size is already within the permissible limit, then add another resource disk in your DataStage configuration file, so the fileset can spread its data files across more file systems.
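For example, a node entry with an extra resource disk might look like the fragment below (the host name and paths are placeholders; substitute the directories that exist on your system):

```
{
  node "node1"
  {
    fastname "myhost"
    pools ""
    resource disk "/data/ds/disk1" {pools ""}
    resource disk "/data/ds/disk2" {pools ""}
    resource scratchdisk "/data/ds/scratch1" {pools ""}
  }
}
```

With two resource disks defined, the fileset's data files are distributed round-robin across both directories, so no single file system has to hold the whole 11 GB.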
I think this will work.
Regards
Amitava