ds_seqput: error in 'write()' - Error 0

Post questions here related to DataStage Server Edition, covering such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

maheshsada
Participant
Posts: 69
Joined: Tue Jan 18, 2005 12:15 am

ds_seqput: error in 'write()' - Error 0

Post by maheshsada »

Hi

I am getting the following error:

ds_seqput: error in 'write()' - Error 0

I have checked the forum, but the explanations there were given for DataStage versions 6 and 7. Are there any restrictions on .dat file generation?

In my job I am selecting data from a table and generating a .dat file.

I have even checked using ulimit -a in the before-job subroutine; the output is:

Testjob..BeforeJob (ExecSH): Executed command: "ulimit -a"
*** Output from command was: ***
time(seconds) unlimited
file(blocks) unlimited
data(kbytes) unlimited
stack(kbytes) 8192
coredump(blocks) unlimited
nofiles(descriptors) 1024
memory(kbytes) unlimited

At the OS level I am getting the same values. Can you please tell me whether the restriction comes from DataStage or from the OS file system?
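
One way to separate the two is to try writing a large file in the same directory outside of DataStage. A minimal sketch (the directory and file name below are examples, not taken from the job):

# Write just past 2GB in the target directory, bypassing DataStage entirely.
# If this also fails, the limit comes from the OS or file system, not DataStage.
dd if=/dev/zero of=/target/dir/test_2gb.dat bs=1048576 count=2049
ls -l /target/dir/test_2gb.dat
rm /target/dir/test_2gb.dat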

regards
Magesh S
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Version shouldn't matter; the error is the same for 5, 6, 7... it should mean you ran out of space where you were writing the file. Is that not the case?
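
A quick way to rule that out is to check free space on the file system holding the output. A minimal sketch, assuming the output directory is /target/dir (an example path):

# Show free space (in KB) on the file system holding the output directory.
df -k /target/dir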
-craig

"You can never have too many knives" -- Logan Nine Fingers
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

There can be a number of reasons for a sequential write to fail, but I don't think the error code of "0" is correct or is going to help.

Can you please try redirecting the output .dat file to /tmp? Does it work? If so, then most likely you have a permissions or file space issue in the original location.

Does this error occur on writing the first line, or at some point later on? If later on, how large is the file? It could be hitting a 2GB limit imposed by the file system.
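
One telltale sign: if the partially written file stops at exactly 2147483647 bytes (2^31 - 1), you are hitting the 32-bit file size boundary. A minimal check, with an example path:

# The size column reveals whether the write stopped at the 2GB boundary.
ls -l /target/dir/output.dat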
maheshsada
Participant
Posts: 69
Joined: Tue Jan 18, 2005 12:15 am

Post by maheshsada »

Hi

The error occurs when the file size reaches 2GB. I have checked the directory; it has enough free space (around 5GB), and the job has been running successfully for months.

I have also checked the file sizes generated in previous runs, which were well below 2GB.

So this may be a restriction on file size imposed by the OS.

Is my understanding right?

regards

Magesh S
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Check with your UNIX administrator. Some UNIX operating systems must have large file support explicitly switched on. This may not have occurred, which would explain the 2GB limit.
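
The exact commands vary by platform; as a sketch only (the mount point /data is an example), on HP-UX or Solaris with VxFS the administrator might check and enable it like this:

# Check whether the file system was mounted with large file support.
mount -v | grep largefiles
# Enable large files on an existing VxFS file system (run as root).
fsadm -F vxfs -o largefiles /data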
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
maheshsada
Participant
Posts: 69
Joined: Tue Jan 18, 2005 12:15 am

Post by maheshsada »

Hi

Thank you all. The error was due to the file size limit in the OS. Once the UNIX admin changed the setting, the job created a file larger than 2GB.

regards

Magesh S