Garbage data

Post questions here relating to DataStage Server Edition in such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

shrey3a
Premium Member
Posts: 234
Joined: Sun Nov 21, 2004 10:41 pm


Post by shrey3a »

Hi,

We have a job that aborts due to garbage data trailing behind one or two columns, as it appears in the Director log. But when we reset the job and rerun it, it runs fine.

The only cause I can see is the IPC stage; when I remove the IPC stage, as we have done in many other jobs, it runs fine. Is this happening due to a memory leak or the timeout property, and how can we avoid it without removing the IPC stage?

The data appears like 1234# followed by some funny characters...

Thanks,
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

Did you try increasing the buffer size? How large is your record?
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
shrey3a
Premium Member
Posts: 234
Joined: Sun Nov 21, 2004 10:41 pm

Post by shrey3a »

DSguru2B wrote:Did you try increasing the buffer size? How large is your record?
It's 128. How much can we increase it? Is there a formula for it, or is it a hit-and-trial method?

Regards,
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

That's the default. That number specifies a block of memory to hold the data: one block for writing to and the other for reading. So if you put 128, 256 is actually allocated. I would say increase it; let it be a multiple of the complete record size plus a few bytes.
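A minimal Python sketch (with hypothetical helper names) of the arithmetic described above: the configured size is allocated twice, once for the writing buffer and once for the reading buffer, and a sensible setting is a multiple of the record size plus a few bytes of slack.

```python
def total_ipc_memory(buffer_size):
    """The IPC stage allocates two buffers of the configured size:
    one for the writing link, one for the reading link."""
    return 2 * buffer_size

def suggested_buffer_size(record_size, records_per_block, slack=8):
    """'A multiple of the complete record size + a few bytes'."""
    return records_per_block * record_size + slack

print(total_ipc_memory(128))          # a setting of 128 means 256 in total
print(suggested_buffer_size(144, 1))  # one 144-byte record plus slack -> 152
```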
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
shrey3a
Premium Member
Posts: 234
Joined: Sun Nov 21, 2004 10:41 pm

Post by shrey3a »

DSguru2B wrote:Did you try increasing the buffer size? How large is your record?
Just checked, it's 512. Will it make a difference to the performance of the jobs? We have a batch cycle of nearly 300 jobs, and more than 200 jobs might be using the IPC stage, if we change it to 1024 via the environment variable for the IPC buffer.

Thanks
shrey3a
Premium Member
Posts: 234
Joined: Sun Nov 21, 2004 10:41 pm

Post by shrey3a »

DSguru2B wrote:That's the default. That number specifies a block of memory to hold the data: one block for writing to and the other for reading. So if you put 128, 256 is actually allocated. I would say increase it; let it be a multiple of the complete record size plus a few bytes.
So I have seventeen fields with a total length of 144, i.e. the sum of the lengths of all fields. So you suggest I could increase it to 200?

Thanks
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

The buffer size is specified in KBytes, not in bytes.
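Given that the setting is in KBytes, a quick back-of-the-envelope check in Python (using the 144-byte record total quoted earlier in the thread) shows the default of 128 already holds hundreds of rows per buffer:

```python
KB = 1024
buffer_kb = 128      # default IPC buffer setting, in KBytes
record_bytes = 144   # sum of field lengths from the earlier post
rows_per_buffer = (buffer_kb * KB) // record_bytes
print(rows_per_buffer)  # 910 complete rows fit in one 128 KB buffer
```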
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.