Hi
I am getting the following error:
ds_seqput: error in 'write()' - Error 0
I have checked the forum, but the explanations given were for DataStage versions 6 and 7. Are there any restrictions on .dat file generation?
In my job I am selecting data from a table and generating a .dat file.
I have even checked using ulimit -a in the before-job subroutine; the output is:
Testjob..BeforeJob (ExecSH): Executed command: "ulimit -a"
*** Output from command was: ***
time(seconds) unlimited
file(blocks) unlimited
data(kbytes) unlimited
stack(kbytes) 8192
coredump(blocks) unlimited
nofiles(descriptors) 1024
memory(kbytes) unlimited
In the OS I am also getting the same values. Can you please tell me whether the restriction is due to DS or the OS file system?
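As a minimal cross-check of the listing above, the file-size limit can be queried on its own; ulimit -f prints just the "file(blocks)" value, in 512-byte blocks (run it as the same user the DataStage job runs under):

```shell
# Per-process file-size limit, in 512-byte blocks ("unlimited" means no cap).
ulimit -f

# If this prints "unlimited", a 2 GB cap would have to come from the
# file system itself rather than from process resource limits.
```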
regards
Magesh S
There can be a number of reasons for a sequential write to fail, but I don't think that the error code of "0" is correct or is going to help.
Can you please try redirecting the output .dat to write to /tmp. Does it work? If yes, then most likely you have a permissions or a filespace issue in the original location.
Does this error occur on writing the first line or at some point later on? If later, how large is the file? It could be hitting a 2 GB limit imposed by the file system.
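One way to test for a 2 GB cap directly, under the assumption that the limit (if any) is in the target file system, is to seek just past the 2 GB mark with dd and see whether the write succeeds. The path here is an example; substitute the directory the job actually writes to:

```shell
# Example target; substitute the directory where the .dat file is written.
TARGET_DIR=/tmp
TEST_FILE="$TARGET_DIR/largefile.test"

# Create a sparse file whose last byte lies just past 2 GB (2^31 bytes).
# On a file system limited to 2 GB files, this write fails.
if dd if=/dev/zero of="$TEST_FILE" bs=1 count=1 seek=2147483648 2>/dev/null; then
    echo "writes beyond 2 GB succeed in $TARGET_DIR"
else
    echo "2 GB limit appears to be in effect in $TARGET_DIR"
fi

rm -f "$TEST_FILE"
```

Because the file is sparse, only one byte of real disk space is consumed, so this is safe even with limited free space.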
Hi
The error occurs when the file size reaches 2 GB. I have checked the directory: it has enough free space (around 5 GB), and the job has been running successfully for months.
I have also checked the file sizes generated in previous runs, which were well below 2 GB.
So this may be a restriction imposed by the OS on file size.
Is my understanding right?
regards
Magesh S
Check with your UNIX administrator. Some UNIX operating systems must have large file support explicitly switched on. This may not have occurred, which would explain the 2GB limit.
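One portable way to ask the system directly, rather than guessing, is getconf FILESIZEBITS, which reports the number of bits needed to represent the largest regular file on the file system at a given path: 32 corresponds to the 2 GB signed-offset limit, while 64 means large file support is enabled. The path is an example:

```shell
# Example path; point this at the directory where the .dat file is written.
TARGET_DIR=/tmp

# FILESIZEBITS is the minimum number of bits needed to represent, as a
# signed integer, the maximum file size on this file system:
# 32 implies a 2 GB cap, 64 implies large file support.
getconf FILESIZEBITS "$TARGET_DIR"
```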
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.