INPUT_TO_AGGR: ds_ipcput() - timeout waiting for mutex

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

shivan
Participant
Posts: 70
Joined: Mon Jul 25, 2005 9:29 am

INPUT_TO_AGGR: ds_ipcput() - timeout waiting for mutex

Post by shivan »

Hi,
I am getting this error:
.INPUT_TO_AGGR: ds_ipcput() - timeout waiting for mutex
I would really appreciate it if someone could tell me how to solve this problem.

thanks
shivan
kumar_s
Charter Member
Posts: 5245
Joined: Thu Jun 16, 2005 11:00 pm

Re: INPUT_TO_AGGR: ds_ipcput() - timeout waiting for mutex

Post by kumar_s »

Hi shivan,
Tell us more about the issue: exactly where do you get this error, and what are you trying to do?
I can see the error relates to the aggregator stage, but that is about all the information there is so far.

regards
kumar
shivan
Participant
Posts: 70
Joined: Mon Jul 25, 2005 9:29 am

Re: INPUT_TO_AGGR: ds_ipcput() - timeout waiting for mutex

Post by shivan »

Hi,
Thanks for replying. I am trying to pull data from DB2 and write it to a mainframe database. There are more than 1 million records. The job runs fine from DB2 through the transformer and into the aggregator, but it fails at the point of writing to the mainframe. I was wondering whether it has something to do with the array size, which is currently 1, or the transaction handling, which I have set to 250.
I hope this helps you understand my problem.
thanks
shivan
Sainath.Srinivasan
Participant
Posts: 3337
Joined: Mon Jan 17, 2005 4:49 am
Location: United Kingdom

Post by Sainath.Srinivasan »

Disable inter-process communication and retry.
shivan
Participant
Posts: 70
Joined: Mon Jul 25, 2005 9:29 am

Post by shivan »

Inter-process communication is already disabled; the setting is at its default.

thanks
shivan
shivan
Participant
Posts: 70
Joined: Mon Jul 25, 2005 9:29 am

Post by shivan »

I ran the job again. Now I get this error:
CLI0108E Communication link failure. SQLSTATE=40003

thanks
kumar_s
Charter Member
Posts: 5245
Joined: Thu Jun 16, 2005 11:00 pm

Post by kumar_s »

Hi shivan,

It looks like your connection timed out. Your source is returning a lot of data, and by the time all the rows have been read from the source, the target connection has been dropped because it reached its idle-time limit. Try testing with a smaller volume of data, and talk to your DBA about the connection idle time.

regards
kumar
shivan
Participant
Posts: 70
Joined: Mon Jul 25, 2005 9:29 am

Post by shivan »

Thanks Kumar, I will check with a DBA. The other problem I have is performance: the throughput is only around 800 rows per second. Is there any way I can increase it?

thanks
shivan
vinaymanchinila
Premium Member
Posts: 353
Joined: Wed Apr 06, 2005 8:45 am

Post by vinaymanchinila »

Hi Shivan,
Use a Sort stage before the aggregator and sort on the grouping columns. Also make sure you specify the same sort order in the Aggregator stage.

If you can split the query into smaller time frames, that's great. Otherwise, try extracting into a flat file and then loading it in another job. Landing the data is not ideal, since it is an extra step you don't strictly need, but it makes testing much easier than re-running the query every time the job fails!
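If you do land the data, the two jobs can be chained from a small job-control routine. Just as a rough DataStage BASIC sketch (the job names EXTRACT_TO_FLAT_FILE and LOAD_TO_MAINFRAME are made-up placeholders for whatever your own jobs are called):

* Run the extract job (DB2 -> flat file), then the load job (flat file -> mainframe).
* Both job names below are placeholders.
ExtractJob = DSAttachJob("EXTRACT_TO_FLAT_FILE", DSJ.ERRFATAL)
ErrCode = DSRunJob(ExtractJob, DSJ.RUNNORMAL)
ErrCode = DSWaitForJob(ExtractJob)
Status = DSGetJobInfo(ExtractJob, DSJ.JOBSTATUS)
ErrCode = DSDetachJob(ExtractJob)

* Only run the load if the extract finished cleanly (OK or with warnings).
If Status = DSJS.RUNOK Or Status = DSJS.RUNWARN Then
   LoadJob = DSAttachJob("LOAD_TO_MAINFRAME", DSJ.ERRFATAL)
   ErrCode = DSRunJob(LoadJob, DSJ.RUNNORMAL)
   ErrCode = DSWaitForJob(LoadJob)
   ErrCode = DSDetachJob(LoadJob)
End Else
   Call DSLogWarn("Extract did not finish OK, skipping the load", "ExtractThenLoad")
End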
shivan
Participant
Posts: 70
Joined: Mon Jul 25, 2005 9:29 am

Post by shivan »

Hi,
I am getting this error:
Abnormal termination of stage CopyOfDB2_INVOICE_LINE_ITEM_RPT_part..Aggregator3 detected
I am using a Link Partitioner and a Link Collector.

thanks
shivan
vinaymanchinila
Premium Member
Posts: 353
Joined: Wed Apr 06, 2005 8:45 am

Post by vinaymanchinila »

Hi,
You mean you are using them before and after the aggregator?
And are you sorting the data before passing it to the aggregator?
shivan
Participant
Posts: 70
Joined: Mon Jul 25, 2005 9:29 am

Post by shivan »

No, I am not. If that is what I am doing wrong, I will add it. I thought the sort was just there to improve performance.

thanks
shivan
shivan
Participant
Posts: 70
Joined: Mon Jul 25, 2005 9:29 am

Post by shivan »

Hi,
I am not using any sorting at all. My job looks like this:
DB2, then a Link Partitioner, then three transformers, then three aggregators, then a Link Collector, and then the mainframe database. Now I am getting this error:
ds_ipcgetnext - timeout waiting for mutex
There are about 1.7 million records. Could I run three jobs based on different dates, which would divide the data? I am not sure whether that would be acceptable.
Your help is really appreciated.
thanks
shivan
pnchowdary
Participant
Posts: 232
Joined: Sat May 07, 2005 2:49 pm
Location: USA

Post by pnchowdary »

Hi Shivan,
shivan wrote: I am not using any sorting at all.
Passing unsorted input data to an Aggregator might work for a small number of rows, because the Aggregator sorts the data internally and then applies the aggregate functions.

However, that might not work with a large number of rows. It is therefore highly recommended to sort the data in a Sort stage and to specify the same sort keys in the Aggregator stage.
shivan wrote: There are about 1.7 million records. Could I run three jobs based on different dates, which would divide the data?
Yes, you can run three jobs, with the input to each job being a subset of the complete data selected by your date criteria.
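Just to illustrate what that could look like from a controlling routine in DataStage BASIC (the job name DB2_TO_MAINFRAME, the parameters START_DATE/END_DATE, and the dates are only examples; substitute whatever exists in your own job):

* Run the same job three times, once per date slice. All names and dates are examples.
StartDates = "2005-01-01":@FM:"2005-05-01":@FM:"2005-09-01"
EndDates = "2005-04-30":@FM:"2005-08-31":@FM:"2005-12-31"

For i = 1 To 3
   hJob = DSAttachJob("DB2_TO_MAINFRAME", DSJ.ERRFATAL)
   ErrCode = DSSetParam(hJob, "START_DATE", StartDates<i>)
   ErrCode = DSSetParam(hJob, "END_DATE", EndDates<i>)
   ErrCode = DSRunJob(hJob, DSJ.RUNNORMAL)
   ErrCode = DSWaitForJob(hJob)
   Status = DSGetJobInfo(hJob, DSJ.JOBSTATUS)
   ErrCode = DSDetachJob(hJob)
   If Status <> DSJS.RUNOK And Status <> DSJS.RUNWARN Then
      Call DSLogWarn("Date slice ":i:" did not finish OK", "RunByDateSlice")
   End
Next i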

Gurus, please correct me if I am wrong in any of the above comments.
Thanks,
Naveen
vinaymanchinila
Premium Member
Posts: 353
Joined: Wed Apr 06, 2005 8:45 am

Post by vinaymanchinila »

I had the same issue with the aggregator when I did NOT use the Sort stage, and I had 2.5 million records.

I sorted the data and it now works fine!