Job does not progress past a certain percentage...
Moderators: chulett, rschirm, roy
-
- Participant
- Posts: 75
- Joined: Thu Nov 27, 2008 10:12 am
Job does not progress past a certain percentage...
I searched this forum for a long time and tried all the suggestions I could find. However, the job stays in the running state for a very long time: the progress percentage in the log does not advance past 40 percent, and no error or warning messages appear in the log file.
It's a simple job that involves only two stages: the source is a DB2 stage and the target is also a DB2 stage. If I write the source DB2 stage's output to a Data Set instead, it works fine, but DB2 to DB2 the data does not get loaded. The log shows no error messages and the job link is blue/green. I can see the rows passing (initially 102 rows, then 198 rows...), but the target table has no data.
I increased the transaction size and array size gradually from 1 up to 40000. Can you please share your ideas on this issue?
Many thanks
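For readers hitting the same symptom: the two knobs mentioned above do different things. Array size controls how many rows travel to the database per network round trip, while transaction size controls how many rows are written between commits. A minimal sketch of how they partition a load (plain Python for illustration, not DataStage code; `batch_counts` is a hypothetical helper, and real stage behavior can differ):

```python
def batch_counts(total_rows, array_size, transaction_size):
    """Return (round_trips, commits) for a simple batched load.

    array_size       - rows sent to the database per round trip
    transaction_size - rows written between commits (0 = commit once at the end)
    """
    round_trips = -(-total_rows // array_size)  # ceiling division
    commits = -(-total_rows // transaction_size) if transaction_size else 1
    return round_trips, commits

# With roughly the poster's test volume (~2000 rows):
print(batch_counts(2000, 1, 1))        # (2000, 2000) - one row per trip, commit every row
print(batch_counts(2000, 100, 1000))   # (20, 2)      - far fewer trips and commits
```

Note that pushing both values up to 40000 (larger than the row count) collapses the load into a single round trip and a single commit, so all rows stay invisible to other sessions until the very end.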
Re: Job does not progress past a certain percentage...
Hi,
ds_search2008 wrote: Its a simple job that involves two stages only.
Source:DB2 stage and target is also DB2 stage.
I understand that DataStage is not doing any validation or transformation of the records here; you are simply picking records from one DB2 table and inserting/updating them into another DB2 table.
I think writing a DB procedure would be useful in this case.
-
- Participant
- Posts: 75
- Joined: Thu Nov 27, 2008 10:12 am
Thank you both for your replies.
v2kmadhav
I tried this possibility as well, but the job still just stays in the running state. Please help.
v2kmadhav
Does it work any better with an active stage in between the read and write processes?
Which stage are you using? The API stage? When you set the array size to 1, do you still not see records in your table? You said it passed 102 records?
Can you try using the EE stage to see if it makes a difference?
How big is that target table?
I hope you are not reading from and writing to the same table :D
All the best...
-
- Participant
- Posts: 75
- Joined: Thu Nov 27, 2008 10:12 am
Thanks for the replies.
Sima wrote: What is your transaction isolation setting in the API stage?
Sima, sorry if I misunderstood your question. There are no triggers set on any of the tables. For testing we even made the target table very simple, with no keys and no NOT NULL columns, and we still face this issue.
Transaction size = 1; array size = 1.
How many inserts are there, and how many of the records in your target table are going to be updated?
Did this job ever run faster before? Has there been any considerable change to DataStage, the table, or the database? I'm sure you have looked at indexes too...
Did you try using the EE stage?
Are the target table and source table on the same database? If they are different, how are they cataloged on your machine?
If your target table is on a remote machine, what is the bandwidth of that line?
-
- Participant
- Posts: 75
- Joined: Thu Nov 27, 2008 10:12 am
Thanks v2kmadhav.
There are around 2000 records (we started with test data to check the load). I tried loading field by field and found a bit of improvement: the progress percentage increased to 90%. However, the data still does not get loaded at the back end. I changed the transaction size and array size starting from 1. There are no indexes on the target table.
Yes, we have tried EE; we face the same issue.
To rule out the source, we now use a Sequential File stage as the source instead of the table, so there shouldn't be any issue on that side. I also tried adding an active stage between source and target; even that is not working.
I have no idea what is going on with this job. Nothing about it is working.
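One more possibility worth recording for anyone who finds this thread: rows can move across the job link and still be invisible in the target table, because until the writing session commits, no other session sees its inserts. If the job hangs before its first commit, or the database rolls the work back when the job never finishes, the monitor shows row counts while the table stays empty. A stand-in sketch in SQLite (illustration only; on DB2 the equivalent check is querying the table from a separate session before and after the job commits):

```python
import os
import sqlite3
import tempfile

# Uncommitted inserts are visible to the writer but to nobody else.
path = os.path.join(tempfile.mkdtemp(), "demo.db")
conn = sqlite3.connect(path)
conn.execute("CREATE TABLE target (id INTEGER)")
conn.commit()

conn.executemany("INSERT INTO target VALUES (?)", [(i,) for i in range(2000)])
# No commit yet: the writer itself already sees all 2000 rows...
print(conn.execute("SELECT COUNT(*) FROM target").fetchone()[0])   # 2000

# ...but an independent session (like the poster checking the table) sees none.
other = sqlite3.connect(path)
print(other.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # 0

conn.commit()
print(other.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # 2000
```

This matches the reported symptom exactly: 102 rows, then 198 rows shown passing on the link, yet a target table with no data in it.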