Hi,
I have created a job to check the quality of the source file.
Every import error raises a warning.
We read the logs and identify the columns that contain junk data.
Now, if the source file has 6 million records and, in the worst case, all of them contain junk data, the Director log is going to be massive.
I wanted to do an impact analysis to see how this would affect the DataStage server.
Regards,
Samyam
Space Occupied by datastage logs
Moderators: chulett, rschirm, roy
-
- Premium Member
- Posts: 258
- Joined: Tue Jul 04, 2006 10:35 pm
- Location: Toronto
pardon me... yuk.
Don't inject warning messages into your job execution log to validate the quality of your input data. I would rather create a separate file containing the dirty records and another containing the valid records.
You could have a COUNT value echoed to your job execution log if you wish.
Always plan for the worst case, and here that would mean suddenly injecting 6 million rows of log information into your UV database. You do not want to do that.
What do you plan on doing with the warning messages?
Did your site turn off the default "abort after 50 warnings" setting?
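The split-file approach above can be sketched outside of DataStage as well. This is a minimal illustration in Python, not a DataStage job: the file names and the `is_valid()` rule are hypothetical stand-ins for whatever validation your real job performs. The point is that dirty rows go to their own file and only a summary COUNT is echoed to the log, rather than one warning per bad row.

```python
import csv

def is_valid(row):
    # Hypothetical validation rule for illustration only: treat a row
    # as "junk" unless it has two fields and the second is numeric.
    # Replace with the real data-quality checks for your source file.
    return len(row) == 2 and row[1].isdigit()

def split_records(source, valid_path, dirty_path):
    """Route each record to a valid or dirty file; return the counts."""
    valid_count = dirty_count = 0
    with open(source, newline="") as src, \
         open(valid_path, "w", newline="") as good, \
         open(dirty_path, "w", newline="") as bad:
        good_w, bad_w = csv.writer(good), csv.writer(bad)
        for row in csv.reader(src):
            if is_valid(row):
                good_w.writerow(row)
                valid_count += 1
            else:
                bad_w.writerow(row)
                dirty_count += 1
    # Echo only the totals to the log -- one line, however many
    # records were dirty -- instead of 6 million warning entries.
    print(f"valid={valid_count} dirty={dirty_count}")
    return valid_count, dirty_count
```

With this shape, even the worst case (every record dirty) adds one summary line to the log; the bad rows themselves sit in the reject file where you can inspect or reprocess them.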