Dear All,
Job Description:
Dataset --> Copy --> Oracle stage
Method: Load Append
Partition Type: Same
Index mode: Rebuild
Node: 4 nodes
Disk space: 30 GB
With around 593,708 records, this job runs fine.
With 3,742,694 records, the job aborts with the following message:
Error Message:
CPY,0: Failure during execution of operator logic.
CPY,0: Fatal Error: I/O subsystem: partition 0 must be a multiple of 131072 in size (was 9943826432). The partition was evidently corrupted.
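As a quick sanity check, the numbers in the error message can be tested directly. The engine writes dataset partition files in 128 KB (131,072-byte) pages, per the error text itself, so a healthy partition file is always an exact multiple of that size; a non-zero remainder means the file ends mid-page, which typically points to a truncated write (a full filesystem is a common cause, though that is an interpretation, not something the log confirms):

```python
# Figures taken from the error message above.
PAGE = 131072          # dataset page size: "must be a multiple of 131072 in size"
size = 9943826432      # reported size of partition 0

# A non-zero remainder means the last page was only partially written.
print(size % PAGE)     # -> 49152, i.e. the file ends 48 KB into a page
```

Note that ~9.9 GB in a single partition against 30 GB of scratch/resource disk across 4 nodes leaves little headroom, which is consistent with (but does not prove) a disk-full truncation.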
-
- Premium Member
- Posts: 54
- Joined: Thu Oct 18, 2007 4:20 am
- Location: Chennai
Regards,
LakshmiNarayanan
Thanks, chulett.
You are absolutely correct. However, I ran into something new in DataStage.
My job sequence ran for about 1 hour 30 minutes for approximately 3,742,694 records.
I was advised to remove the length from all varchar columns in the target dataset in order to reduce the size of the dataset created.
Example design:
Oracle stage --> Copy --> Dataset
Imagine the Oracle stage contains 11 columns, each of datatype varchar(50).
While mapping to the target dataset, I removed the length from all the varchar columns, and the result was remarkable:
the dataset was much smaller and the job ran faster.
After I changed all the jobs in the sequence, the run took 30 minutes for the same 3,742,694 records and occupied only 3 GB of space, where before it consumed 30 GB.
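For anyone wondering why dropping the length helps: a bounded Varchar(n) is persisted padded out to its full declared length in a parallel dataset, while an unbounded Varchar is stored at its actual data length plus a small length prefix. A rough back-of-envelope sketch using the figures from this thread (the 5-byte average data length and 4-byte prefix are illustrative assumptions, not measured values):

```python
# Back-of-envelope: bounded vs unbounded varchar storage for
# 11 Varchar(50) columns over ~3.7M rows (figures from the post).
rows = 3_742_694
cols = 11
declared_len = 50        # Varchar(50): padded to full declared length on disk
avg_actual_len = 5       # ASSUMED average real data length, for illustration
prefix = 4               # ASSUMED per-value length prefix for unbounded varchar

bounded = rows * cols * declared_len                  # fixed-width storage
unbounded = rows * cols * (avg_actual_len + prefix)   # variable-length storage

print(f"bounded:   {bounded / 1e9:.1f} GB")    # ~2.1 GB for these columns alone
print(f"unbounded: {unbounded / 1e9:.1f} GB")  # ~0.4 GB
```

The exact ratio depends on how short the real data is relative to the declared lengths, but the direction matches the 30 GB → 3 GB drop reported above.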
Regards,
LakshmiNarayanan