
Multiple jobs have to write to a single file

Posted: Tue Nov 12, 2013 12:27 pm
by Maximus_Jack
Hi
I have a requirement where multiple instances of a job have to write to a single file. I tried with both a Sequential File and a Data Set.

With a Sequential File, some records are missing when the job finishes.

With a Data Set, sometimes I get the error "Error updating ORCHESTRATE File Dataset descriptor for", and sometimes there is no error at all but the records are not available in the file.

Is there any other way for multiple jobs to write to a single file?

DataStage version: 8.7 Fix Pack 1

MJ

Posted: Tue Nov 12, 2013 12:49 pm
by chulett
Simultaneously? No.

Posted: Tue Nov 12, 2013 1:02 pm
by Maximus_Jack
Thanks, chulett.


Strange... is there no way to do it?

Posted: Tue Nov 12, 2013 1:11 pm
by chulett
It's not strange, it's the nature of sequential media, which supports multiple readers but only a single writer. You'd need to sequence your jobs so that they append their output, for example along the lines of the sketch below.
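
A minimal sketch of that sequencing approach, assuming each job instance is parameterised to write its own part file and the parts are concatenated once all runs finish. The project, job, parameter and file names here are placeholders (not from this thread), and the dsjob options may differ slightly between releases:

    # Hypothetical example: run the job instances one after another with the
    # dsjob command-line interface, then merge their output into one file.
    import subprocess

    project = "MyProject"                      # placeholder project name
    jobs = ["LoadJob_1", "LoadJob_2", "LoadJob_3"]   # placeholder job names
    part_files = [f"/data/out/part_{job}.txt" for job in jobs]

    for job, part in zip(jobs, part_files):
        # -jobstatus makes dsjob wait for the run to complete before returning;
        # TargetFile is an assumed job parameter naming the output file.
        subprocess.run(
            ["dsjob", "-run", "-jobstatus",
             "-param", f"TargetFile={part}", project, job],
            check=True,
        )

    # Once every run has finished, append the part files into the single target.
    with open("/data/out/final_output.txt", "wb") as out:
        for part in part_files:
            with open(part, "rb") as src:
                out.write(src.read())

The same effect can be had inside DataStage itself with a Sequencer job that runs the instances one after another, each appending to the target, rather than driving them from an external script.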

Posted: Tue Nov 12, 2013 1:25 pm
by Maximus_Jack
thanks...

But it should work for a Data Set, shouldn't it?

Posted: Tue Nov 12, 2013 2:53 pm
by ray.wurlod
No, because each segment file of a Data Set is itself a sequential file (the same argument as before), and each parallel job keeps every segment file in the Data Set in use.

Posted: Wed Nov 13, 2013 8:46 am
by Maximus_Jack
Thanks, ray.wurlod and chulett.