
Input/output error in Sequential File stage

Posted: Tue Oct 24, 2006 8:54 am
by durgaps
I am using a Sequential File stage and am getting the following error while writing output to the sequential file:
ds_seqclose: error in 'close()' - Input/output error
Any idea why this is happening?

Re: Input/output error in Sequential File stage

Posted: Tue Oct 24, 2006 9:18 am
by durgaps
durgaps wrote:ds_seqclose: error in 'close()' - Input/output error
Can this be a space-related issue?

Posted: Tue Oct 24, 2006 9:43 am
by ray.wurlod
It might be that you lack write permission on the parent directory, and so cannot update the date/time modified and date/time accessed for the file you have been writing.

Posted: Tue Oct 24, 2006 8:07 pm
by durgaps
Thanks for the reply, Ray. This happens every three or four runs: I am able to write the file without any issues, and then all of a sudden the problem crops up. So there is no issue with access permissions on the directory.

Thanks,

Posted: Tue Oct 24, 2006 8:12 pm
by talk2shaanc
Looking at the message, I would make three guesses:
1. permissions on the file
2. a space problem
3. you are writing the same file twice in one job, or reading from and writing to the same file in the job

Posted: Wed Oct 25, 2006 7:27 am
by ray.wurlod
Let me ask this question: even when a problem is reported with close(), does all the data get written to the file?

Posted: Thu Oct 26, 2006 1:57 am
by kumar_s
Have you checked the availability of space, since that is what you suspect most?

Posted: Thu Oct 26, 2006 9:48 pm
by durgaps
Hi Ray/Kumar,

When there is a close() error, the job aborts, so the file does not get written.

There is enough space on the server, so there should not be any space-related issue.

Thanks,

Posted: Thu Oct 26, 2006 11:25 pm
by Kirtikumar
Because the job clears the partial file from the directory after the abort, a check made afterwards will always show free space.

Instead, keep an eye on the space while the job is running, checking every two or three seconds.
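That polling can be scripted rather than done by hand. A small sketch (the function name and defaults are my own, not anything DataStage ships):

```python
import shutil
import time

def watch_free_space(path, interval=2.0, samples=30):
    """Sample free bytes on the filesystem holding `path` every
    `interval` seconds while the job runs, so a transient dip is
    caught even though the aborted job deletes its partial file."""
    readings = []
    for _ in range(samples):
        readings.append(shutil.disk_usage(path).free)
        time.sleep(interval)
    return readings
```

Start it just before the job, then compare min(readings) with the expected output file size; if the minimum dips near zero mid-run, the space guess is confirmed despite the clean picture after the abort.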

Posted: Mon Nov 13, 2006 8:56 pm
by durgaps
Hi all,

Thanks for the replies. Apparently the problem has ceased to occur. The only difference is that a soft mount had been used for the server instead of a hard mount; after the hard mount was done, the I/O problems stopped. But the root cause of the error is still unknown. Maybe the experts in this group can shed some light on the issue.

Thanks again.

Posted: Mon Nov 13, 2006 10:50 pm
by chulett
durgaps wrote:The only difference being a soft mount was used for the server instead of a hard mount.
Hmm... when you say 'for the server', do you mean the server where you installed DataStage? If so, I was under the impression it wouldn't even allow something like that. :?

If not that, then curious what you really meant by that statement.

Posted: Tue Nov 14, 2006 7:11 am
by ray.wurlod
Could it be that you only had read permission on the soft mount? Or that the mount was not even there when the failures occurred?