
Writing 64 Target files

Posted: Tue Aug 24, 2010 6:58 am
by Gokul
We have a requirement to write the data into 64 target files. The data is written to one of the 64 files based on the value of a column. The target files are EBCDIC files.

The same design is used in 6 of the jobs. This leads to a bottleneck, with most of the execution time spent on write operations.

This hampers performance. Is there an environment variable that can be set to improve the write performance?

Posted: Tue Aug 24, 2010 7:56 am
by chulett
What O/S? Talk to your SysAdmins, have them monitor the I/O on the system. Perhaps you can move your target to faster drives or they can 'adjust' some filesystem journalling options, etc.

Posted: Tue Aug 24, 2010 8:14 am
by Sainath.Srinivasan
Why don't you write to a single file and then split it based on some value / flag?
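
For illustration only, here is a minimal Python sketch of that split-after-write idea, done outside DataStage. It assumes a hypothetical pipe-delimited staging file with the routing key in the first column, and it ignores the EBCDIC encoding of the real targets:

    # Sketch: write everything to one staging file first, then fan rows out
    # to per-key target files in a single post-processing pass.
    STAGING_FILE = "staging.dat"     # assumed name of the single staging file
    TARGET_PREFIX = "target_"        # assumed target naming, e.g. target_00.dat

    def split_by_key(staging_path: str) -> None:
        """Read the staging file once and route each row to its key's file."""
        handles = {}
        try:
            with open(staging_path, "r", encoding="utf-8") as src:
                for line in src:
                    key = line.split("|", 1)[0]   # assumed: key is the first field
                    out = handles.get(key)
                    if out is None:
                        out = open(f"{TARGET_PREFIX}{key}.dat", "w", encoding="utf-8")
                        handles[key] = out
                    out.write(line)
        finally:
            for fh in handles.values():
                fh.close()

    if __name__ == "__main__":
        split_by_key(STAGING_FILE)

The point is that the job itself performs one sequential write, and the split into 64 files happens in a separate step that can be scheduled or parallelised independently of the job run.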

Posted: Tue Aug 24, 2010 4:19 pm
by ray.wurlod
Are all the files on the same file system? Do you have the flexibility to spread the files over multiple disks?