A fileset is created by a parallel job with 12 writes running in 2 nodes, so 24 files are created. When another job reads from the same fileset, it does not read all 24 files, instead it seems it just reads one of them. Both write and read jobs have the same pathname of the control file for the file set, such as, ErrDW. What's wrong there? Why does read from fileset not read all 24 files?
Thanks,
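One way to debug a situation like this is to inspect the control file itself: it is a plain-text file that records the segment data files belonging to the File Set, so a read job should see every file it registers. The exact layout varies by DataStage version, so the sketch below uses a made-up control file (the `ErrDW_example` name, the header line, and the one-path-per-line layout are all assumptions for illustration) and simply counts the registered segment paths:

```python
import os

# Toy stand-in for a File Set control file. Real layout varies by DataStage
# version; assumed here: one segment-file path per line among other metadata.
control_file = "ErrDW_example"
with open(control_file, "w") as f:
    f.write("-- header: hypothetical metadata --\n")
    for node in range(2):          # 2 nodes, as in the job described
        for w in range(12):        # 12 writers per node
            f.write(f"/data/node{node}/ErrDW.part{w:02d}\n")

# A read job should enumerate every segment file the control file registers.
with open(control_file) as f:
    segment_files = [ln.strip() for ln in f if ln.startswith("/")]

print(len(segment_files))  # 24 segment files expected
os.remove(control_file)
```

If the real control file lists fewer than 24 segment files, the writers were not all registering into the same File Set, which would explain the read job seeing only one file.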
Read from a file set
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
What is meant by "12 writes"?
The job writes to 12 file sets with the same control-file pathname and settings.
This job tries to test whether a File Set can support multiple writers within one job, or from different jobs, the way a hashed file can in Server Edition. It seems the File Set does not have the same capability.
Thanks,
Yes and no. It is not the File Set itself that imposes this limit; it's the underlying file system. Each of the segment files that make up the File Set can have only one writer at a time.
The same is true for Data Sets, Sequential Files, and all other structures that rely directly upon files in the file system.
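The one-writer-per-file constraint described above can be illustrated outside DataStage. This is a minimal Python sketch, not the engine's actual mechanism: it uses a Unix-only advisory `fcntl` lock on a hypothetical segment-file path to show that a second concurrent writer on the same file is refused:

```python
import fcntl
import os

path = "/tmp/segment_file.dat"  # hypothetical segment file, for illustration

# First writer takes an exclusive, non-blocking advisory lock on the file.
w1 = open(path, "a")
fcntl.flock(w1, fcntl.LOCK_EX | fcntl.LOCK_NB)

# A second would-be writer on the same file fails immediately.
w2 = open(path, "a")
try:
    fcntl.flock(w2, fcntl.LOCK_EX | fcntl.LOCK_NB)
    second_writer_ok = True
except BlockingIOError:
    second_writer_ok = False

print(second_writer_ok)  # False: only one writer at a time
w1.close()
w2.close()
os.remove(path)
```

This is why multiple writers in one job, or across jobs, cannot share the same underlying files the way they can share a hashed file in Server Edition.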
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.