Number of warnings crossed 50 while reading from a sequential file
While reading data from a fixed-width sequential file, the job generated warnings; when the count crossed 50, the job aborted.
The next stage after the Sequential File stage is a Transformer stage, and there is also a reject link on the source Sequential File stage.
The flow of execution is as below:
A script executes the dsjob command to initiate a sequence, and the sequence in turn calls the above job (which is aborting due to the warning limit).
Even though I have called this script from DataStage Director with the warning limit set to "No limit", the job still fails.
I have come across some blogs saying that to avoid this warning limit (say 200), dsjob should be used with the -warn parameter, or the warnings should be demoted.
But this may be counterproductive when the warning count crosses the defined limit anyway.
* The source file is out of my control for modification.
* When there is a reject link on the source file (fixed width), isn't it sufficient to handle this?
Please help me with your thoughts on how to overcome this problem.
Thanks,
HK
*Go GREEN..Save Earth*
You can't "call a script" from the Director, so I'm a little lost there. When the script runs the job, why not use the -warn option there to change the number of warnings from 50? You can use -warn 0 for unlimited, or something higher than 50 to increase the limit.
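For reference, a sketch of what that invocation could look like. MYPROJECT and MYJOB are placeholder names, and the exit-code meanings assume the usual -jobstatus behaviour of dsjob (verify against your version's documentation):

```shell
#!/bin/sh
# Sketch only: MYPROJECT and MYJOB are placeholders.
# -warn 0 removes the warning limit entirely; -warn 200 would raise it to 200.
# With -jobstatus, dsjob waits for the job and exits with its finishing
# status (commonly 1 = finished OK, 2 = finished with warnings, 3 = aborted):
#
#   dsjob -run -warn 0 -jobstatus MYPROJECT MYJOB
#
# A small helper to translate that exit status into something readable:
interpret_status() {
  case "$1" in
    1) echo "finished OK" ;;
    2) echo "finished with warnings" ;;
    3) echo "aborted" ;;
    *) echo "unknown status $1" ;;
  esac
}
```

The calling script can then branch on the status, e.g. `interpret_status $?` right after the dsjob line, so a "finished with warnings" run can be treated differently from an abort.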
-craig
"You can never have too many knives" -- Logan Nine Fingers
Hey Craig,
To clarify: when I say running a script from Director, I have created a sequence job which has an Execute_Command activity wherein I mention the script.
Yes, I can use the -warn 0 option along with the dsjob command, but that may not be a good practice to adopt.
The point I am confused about is: when there is a reject file from the Sequential File stage, does it not ignore the warnings and continue the process?
Thanks,
HK
*Go GREEN..Save Earth*
I agree that setting warnings to unlimited is not a good practice to adopt as a general statement but on a case by case basis? Perhaps.
Sorry, can't really answer the reject link question other than to say that if it generates warnings there's no way for the logging mechanism to ignore them based on where they come from - it's an all-or-nothing affair. AFAIK, you either need to suppress the warnings or adjust (or eliminate) the warning threshold accordingly.
Others may have more better information for you, though. Let's see!
-craig
"You can never have too many knives" -- Logan Nine Fingers
Better practice is to identify the causes of the warnings and to try to eliminate them.
When you run a job from Director or Designer the Job Run Options dialog allows you to set a non-default limit on the number of warnings.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
<grammar nazi>
It would be best if there were three or more options. Here, however, there were only two options (to fix, or not to fix, that is the question), so the comparative adjective "better" should be used.
</grammar nazi>
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Having a reject file from the Sequential File stage is good in that it will let you examine which records were rejected.
Having warnings when records are rejected is good in that it serves as your alert. It is alerting someone to wake up and examine why records are rejected.
One use for that is to take the information back to the source of the records and let them know they are sending you junk. If they don't care and aren't going to fix the problem, then you could demote the warnings to informational status.
So, it seems like you have three or more options:
1) do nothing
2) have the source system fix the records before creating the file(s)
3) create a preprocessing job to tidy up the files yourself before the existing job runs
4) use a message handler to demote the warnings to informational
5) use the dsjob -run -warn 0 option
One of the choices is likely "more better" than the other...
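Option 3, for instance, can be as simple as a pre-filter that splits records by line length before the DataStage job reads the file. A minimal sketch, assuming a record width of 10 characters and illustrative file names (your real width and paths will differ):

```shell
#!/bin/sh
# Sketch of a preprocessing step: lines of the expected fixed width go to
# good.txt, anything else to bad.txt. Width and file names are examples.
WIDTH=10

# Sample fixed-width input: two good records padded to 10 chars, one short one.
printf '%-10s\n%-10s\nSHORT\n' 'REC1' 'REC2' > input.txt

awk -v w="$WIDTH" '
  length($0) == w { print > "good.txt"; next }
                  { print > "bad.txt" }
' input.txt
```

The existing job then reads good.txt unchanged, while bad.txt gives you the same audit trail the reject link would, without tripping the warning limit.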
Choose a job you love, and you will never have to work a day in your life. - Confucius