Terrible Performance due to log file

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

kwwilliams
Participant
Posts: 437
Joined: Fri Oct 21, 2005 10:00 pm

Post by kwwilliams »

jamesrender wrote:
chulett wrote:
I think that it looks like a really useful tool
It is a very useful tool, but you need training to be able to use it properly. Hopefully they will not pull the tool because they have not given proper training; that would be the result with any tool, not just DataStage.
jamesrender
Participant
Posts: 13
Joined: Fri Jan 06, 2006 9:20 am

Post by jamesrender »

kwwilliams wrote:
jamesrender wrote:
It is a very useful tool, but you need training to be able to use it properly. Hopefully they will not pull the tool because they have not given proper training; that would be the result with any tool, not just DataStage.
Bit of a chicken-and-egg situation: I can't convince them to keep it without being proficient with it, and I can't get that proficiency if it goes in the next month. Their feeling is that it's an extra level of complexity for the application; they will probably want some SQL*Loader process created instead, even though that is what DS is all about. Any opportunity to save a few bucks by dropping a license...

I'm sure that I'll discover a thorny issue that DataStage solved once the decision to drop it has been taken...
Last edited by jamesrender on Mon Jan 16, 2006 10:33 am, edited 1 time in total.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

kwwilliams wrote:
jamesrender wrote:
chulett wrote:
I think that it looks like a really useful tool
It is a very useful tool, but you need training to be able to use it properly. Hopefully they will not pull the tool because they have not given proper training; that would be the result with any tool, not just DataStage.
Be careful with your quoting... that wasn't me.
-craig

"You can never have too many knives" -- Logan Nine Fingers
srinagesh
Participant
Posts: 125
Joined: Mon Jul 25, 2005 7:03 am

Post by srinagesh »

Try altering the table with the "nologging" option:

Alter table Emp NoLogging;
(where emp is the name of the table)

This will definitely speed up the processing and will also avoid writing to the Oracle logs.


HTH
Nagesh
jamesrender
Participant
Posts: 13
Joined: Fri Jan 06, 2006 9:20 am

Post by jamesrender »

Having just had a quick scan in my Oracle book, I don't believe that nologging will offer any benefit.
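If I'm reading it right, NOLOGGING only suppresses redo generation for direct-path operations; a conventional-path insert, which is what a typical DataStage OCI stage issues, generates full redo regardless. A rough sketch of the distinction, reusing the emp table from above (empno, ename and the emp_staging table are just illustrative):

ALTER TABLE emp NOLOGGING;

-- Conventional-path insert: still generates full redo, NOLOGGING or not
INSERT INTO emp (empno, ename) VALUES (1, 'SMITH');

-- Direct-path insert: only this kind of operation can skip redo
INSERT /*+ APPEND */ INTO emp SELECT * FROM emp_staging;

And in any case, the log file causing grief here seems to be the DataStage job log, not Oracle's redo log.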
Viswanath
Participant
Posts: 68
Joined: Tue Jul 08, 2003 10:46 pm

Post by Viswanath »

Hi,

Couldn't help asking. Ray, you said that keeping DS logs for a long time without purging would degrade performance? We keep getting requests for data loads dating back more than 12 months, which is why we don't delete any logs. Would purging them help improve performance? Also, is there a way I can store these log files outside of DS so that I can do some archiving?

Cheers,
Vishy
jamesrender
Participant
Posts: 13
Joined: Fri Jan 06, 2006 9:20 am

Post by jamesrender »

Viswanath wrote: Hi,

Couldn't help asking. Ray, you said that keeping DS logs for a long time without purging would degrade performance? We keep getting requests for data loads dating back more than 12 months, which is why we don't delete any logs. Would purging them help improve performance? Also, is there a way I can store these log files outside of DS so that I can do some archiving?

Cheers,
Vishy
I can't speak with any authority, but certainly in my case the size of the log had a direct impact on job performance. It is possible to save DS logs to a file system. I have done it once, but I can't remember how... sorry.
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL
Contact:

Post by kcbland »

The logs degrade performance only when the job has to put a message into its log file. Since the log file is actually a dynamic hash file, an extremely large hash file may take longer to add a row, especially if the file just happens to need to dynamically expand at that point in time. If the job is logging a lot of messages, then yes, absolutely you will notice a severe impact to runtime performance.

At the start and end of processing, the job logs informational messages. There is an impact to total runtime, so keeping the logs purged allows startup and wrapup processing to be minimal. There are two reasons a job can finish moving data yet still seem to be doing something else before finishing: logging/purging messages and updating the &PH& information. For faster startup/wrapup, purge your log files and keep the &PH& directory in the project clean.
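For the manual cleanup, one approach that gets posted here a lot is to use the Administrator client's Command window (or the dssh shell); MyJob and job number 123 are placeholders, and only clear &PH& when no jobs are running:

SELECT JOBNO FROM DS_JOBS WHERE NAME = 'MyJob';
CLEAR.FILE RT_LOG123
CLEAR.FILE &PH&

The first query looks up the job's internal number; RT_LOGnnn is the dynamic hashed file holding that job's log, so clearing it is the blunt-force version of a log purge.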
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
Viswanath
Participant
Posts: 68
Joined: Tue Jul 08, 2003 10:46 pm

Post by Viswanath »

Thanks Kenneth.

I guess I have some work to do then. We have around 300-odd jobs in our Production project. I have to do a manual purge now and then enable auto-purge for these.

But is there a way by which I can automatically store these logs in a file? I will probably need them in the future for auditing purposes.

Cheers,
Vishy
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Yes. Do an all terms search for archive and log. This is one of the posts you will find.
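As a quick sketch of one option: the dsjob command-line client can dump a job's log summary to a text file, which you can then archive wherever you like. ProjectName, JobName and the output path are placeholders, and options vary a little by release, so check dsjob's usage text first:

dsjob -logsum -max 5000 ProjectName JobName > /archive/JobName_log.txt

-logsum writes one summary line per log event; -logdetail will give you the full text of a single event if you need it.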
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Viswanath
Participant
Posts: 68
Joined: Tue Jul 08, 2003 10:46 pm

Post by Viswanath »

Thanks Ray.

Cheers,
Vishy