Do we need to worry about the warnings?

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.


pavankvk
Participant
Posts: 202
Joined: Thu Dec 04, 2003 7:54 am

Post by pavankvk »

Do we really need to worry about the warnings that appear in the log? My jobs have numerous warnings, but I never fixed them; they always do what they are supposed to do, with no issues.

Let me know if getting rid of warnings is really important.
vmcburney
Participant
Posts: 3593
Joined: Thu Jan 23, 2003 5:25 pm
Location: Australia, Melbourne
Contact:

Post by vmcburney »

I would be a bit worried about partitioning warnings, in case your partitioning is causing missed lookups or lost aggregation rows. I would also be concerned about database upserts/inserts/updates that do not have reject links, because when rows get rejected you either get a warning message you ignore or no warning at all. I would investigate each warning and move each safe warning into a message handler until you have no warnings left. Then you will notice when new warnings appear that represent a real problem.
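To illustrate the partitioning risk, here is a hedged Python sketch, not DataStage code: the data, the function names, and the four-way hash partitioning are all invented for illustration. When the reference data is partitioned on a different column than the lookup key, matching rows land in different partitions and the lookup silently misses them:

```python
def partition(rows, key, nparts):
    """Hash-partition rows (dicts) on the given key column."""
    parts = [[] for _ in range(nparts)]
    for row in rows:
        parts[hash(row[key]) % nparts].append(row)
    return parts

def partitioned_lookup(stream, reference, key, ref_part_key, nparts=4):
    """Join within matching partitions only, like a parallel lookup.
    ref_part_key is the column the reference was partitioned on,
    which may (wrongly) differ from the lookup key."""
    s_parts = partition(stream, key, nparts)
    r_parts = partition(reference, ref_part_key, nparts)
    matched = []
    for sp, rp in zip(s_parts, r_parts):
        keys_here = {r[key] for r in rp}
        matched += [s for s in sp if s[key] in keys_here]
    return matched
```

With both sides partitioned on the lookup key, every match is found; partition the reference on the wrong column and matches quietly disappear, with at most a warning in the log to show for it.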

There are good coding practices that reduce and even eliminate warnings, especially in Modify stages that convert metadata, but finding them is largely a matter of trial and error.
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Pavankvk,

I believe that no production job should generate warnings, so in my opinion it is very important to write even PX jobs so that no warnings are generated. Some specific warnings are impossible to get rid of in PX jobs; in those cases I take the warning message and demote it to informational using a message handler.

Quite often some new problem will occur in a production environment that writes a warning to the log file. If this real warning is just one among 2,000 others it will never be noticed; the job will quietly stop running correctly and nobody will know about it!
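The "demote known warnings so the rest stand out" approach can be sketched outside DataStage. This is a hedged Python illustration of the idea, not the message handler mechanism itself; the log format and the safe-message token are invented:

```python
# Hypothetical log format: "SEVERITY: message text".
# KNOWN_SAFE plays the role of a message handler: warnings matching
# a token here are treated as informational and ignored.
KNOWN_SAFE = {
    "operator combination",  # invented example of a demoted warning
}

def unexpected_warnings(log_lines):
    """Return warning lines not covered by the known-safe list."""
    flagged = []
    for line in log_lines:
        if not line.startswith("WARNING:"):
            continue
        if any(token in line for token in KNOWN_SAFE):
            continue  # demoted, as a message handler would do
        flagged.append(line)
    return flagged
```

Once every expected warning has been demoted, anything this returns is a real problem worth investigating, which is exactly the state a production job log should be in.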
vmcburney
Participant
Posts: 3593
Joined: Thu Jan 23, 2003 5:25 pm
Location: Australia, Melbourne
Contact:

Post by vmcburney »

Also be aware that occasionally a PX job will issue a mega-warning and suppress further instances of that warning. So instead of seeing 100,000 warning messages for rows being rejected due to poor null handling, you see just one warning telling you that further warnings have been suppressed. That one warning gets lost in the noise and you are unaware your data is corrupt.
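The poor null handling behind those rejections is avoidable with explicit defaults before conversion. A hedged Python sketch of the principle follows; the function name and default value are invented, and in a PX job this logic would live in a Modify or Transformer stage rather than code:

```python
def to_int_or_default(value, default=0):
    """Convert to int, substituting a default for nulls and blanks
    instead of letting the row be rejected with a warning."""
    if value is None or str(value).strip() == "":
        return default
    return int(value)
```

Handling the null explicitly means no rejected row, no warning, and no suppressed flood of warnings hiding the real problem.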

No warnings in production. None. And throw in some row count auditing while you are at it to make sure your input rows match your output rows and the rows sent to your target tables are actually getting there.
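The row count auditing suggested above might look like this hedged Python sketch; the function and its arguments are invented, and in practice the numbers would come from the job's link statistics and a count against the target table:

```python
def audit_row_counts(rows_in, rows_out, rows_loaded):
    """Fail loudly if counts drift anywhere between source and target."""
    if rows_in != rows_out:
        raise ValueError(f"job dropped rows: {rows_in} in, {rows_out} out")
    if rows_out != rows_loaded:
        raise ValueError(
            f"target short-loaded: {rows_out} sent, {rows_loaded} committed")
    return True
```

Unlike a warning buried in the log, a hard failure here cannot be ignored, which is the point of the audit.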
kumar_s
Charter Member
Charter Member
Posts: 5245
Joined: Thu Jun 16, 2005 11:00 pm

Post by kumar_s »

Hi DSXians,
Can you share the types and ranges of warnings that get added to the message handler on your server, both during and after the development phase?
I have taken pains to reduce warnings to zero. Sometimes the resources and time spent on removing warnings make me think about compromising and handling them in the message handler instead.
But still I have managed to add only 2 warning codes to the message handler.

-Kumar