
Help Needed: Writing to a Fileset gives an error

Posted: Sat Mar 20, 2004 3:28 am
by Inquisitive
I have a job which reads from a sequential file and writes to a fileset.
The job runs fine when my input sequential file has 38 million records (3 GB in size).

However, when my input file grows to 140 million records (11 GB in size), the job gives the following error:

File_Set_11,0: Bad read in APT_FileBufferOutput::spillToNextFile(): Bad file number

Any inputs for resolving this would be appreciated.

Thanks,

Posted: Sat Mar 20, 2004 9:14 am
by gh_amitava
Hi,

Check the maximum file size on your system. Since you are on a UNIX system, check it with the ulimit -a command. If the maximum permissible file size is less than your required file size, increase the limit.
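For reference, the check above can be done like this (a sketch; the exact output format varies by shell and OS):

```shell
# Show all per-process resource limits for the current shell
ulimit -a

# Show just the maximum file size, in 512-byte blocks
# ("unlimited" means no per-process cap on file size)
ulimit -f
```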

If the file size is within the permissible limit, then add a resource disk in your DataStage configuration file, so the fileset can spill across more than one file system.
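For example, a single-node entry in the configuration file with an extra resource disk might look like the sketch below (the node name, fastname, and paths here are placeholders, not taken from the original post):

```
{
  node "node1"
  {
    fastname "myserver"
    pools ""
    resource disk "/data1/datasets" {pools ""}
    resource disk "/data2/datasets" {pools ""}
    resource scratchdisk "/scratch/datasets" {pools ""}
  }
}
```

With two resource disks listed, the parallel engine can distribute the fileset's data files across both paths instead of filling a single file system.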

I think this will work.

Regards
Amitava

Posted: Sat Mar 20, 2004 9:34 pm
by Inquisitive
Hi,

As mentioned in my earlier post, my input sequential file is over 11 GB in size, and ulimit is unlimited.

I have tried both 4-node and 8-node configuration files.

$ ulimit -a
file(blocks) unlimited

Any other inputs ?