Help Needed: Writing to Fileset gives error

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

Inquisitive
Charter Member
Posts: 88
Joined: Tue Jan 13, 2004 3:07 pm

Help Needed: Writing to Fileset gives error

Post by Inquisitive »

I have a job which reads from a sequential file and writes to a fileset.
This job runs fine when my input sequential file has 38 million records (3 GB in size).

However, when my input file grows to 140 million records (11 GB in size), my job gives the following error.

File_Set_11,0: Bad read in APT_FileBufferOutput::spillToNextFile(): Bad file number

Any inputs on resolving this would be appreciated.

Thanks,
gh_amitava
Participant
Posts: 75
Joined: Tue May 13, 2003 4:14 am
Location: California
Contact:

Post by gh_amitava »

Hi,

Check the maximum permissible file size on your system. Since you are on a UNIX system, check it with the ulimit -a command. If the maximum permissible file size is smaller than the file you need to write, increase the limit.
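For reference, here is a quick sketch of checking and raising the per-process file-size limit in a UNIX shell. The exact flags and units can vary between shells and platforms, so treat this as a sketch, not a definitive recipe:

```shell
# Show all current resource limits for this shell
ulimit -a

# Show just the per-process file-size limit
# (usually reported in 512-byte blocks, or "unlimited")
ulimit -f

# Raise the file-size limit for the current session,
# if the hard limit permits it
ulimit -f unlimited
```

Note that the limit has to be in effect in the environment the DataStage jobs actually run under (typically the engine's startup environment), not just in your interactive login shell.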

If the file size is already within the permissible limit, then add another resource disk in your DataStage configuration file.
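As a sketch of what adding resource disks can look like, here is a minimal two-node parallel configuration file with two resource disks per node. The host name and paths are made-up placeholders; substitute whatever applies in your environment:

```
{
  node "node1"
  {
    fastname "etl_host"
    pools ""
    resource disk "/data1/datastage/datasets" {pools ""}
    resource disk "/data2/datastage/datasets" {pools ""}
    resource scratchdisk "/scratch/datastage" {pools ""}
  }
  node "node2"
  {
    fastname "etl_host"
    pools ""
    resource disk "/data2/datastage/datasets" {pools ""}
    resource disk "/data1/datastage/datasets" {pools ""}
    resource scratchdisk "/scratch/datastage" {pools ""}
  }
}
```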

I think this will work.

Regards
Amitava
Inquisitive
Charter Member
Posts: 88
Joined: Tue Jan 13, 2004 3:07 pm

Post by Inquisitive »

Hi,

As mentioned in my earlier post, my input sequential file is over 11 GB in size and my ulimit is unlimited.

I have used both 4-node and 8-node configuration files.

$ ulimit -a
file(blocks) unlimited
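Since ulimit is not the constraint here, it may also be worth confirming that the resource disk and scratchdisk directories named in the configuration file have enough free space for the fileset's data files. A minimal sketch; the DISKS list is a placeholder, and you would substitute the real paths from your own configuration file:

```shell
# Placeholder list of disk locations; substitute the resource disk
# and scratchdisk paths from your APT configuration file
DISKS="."

for d in $DISKS; do
  echo "== $d =="
  df -k "$d"   # free space in 1 KB blocks
done
```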

Any other inputs?