Error viewing log, Computed blink does not match expected
Hello friends,
I am getting the following error when I try to view the log of an aborted job. Can someone please help me resolve the issue?
Thanks a lot in advance
Yamini
Error:
Error selecting from log file RT_LOG1056
Command was: SSELECT RT_LOG1056 WITH @ID LIKE '1N0N' AND TIMESTAMP >= "2004-01-01 00:30:00" COUNT.SUP
Error was: Internal data error. File '/data/Ascential/DataStage/Projects/DATWHR/RT_LOG1056/DATA.30': Computed blink of 0x8EC does not match expected blink of 0x0! Detected within group starting at address 0x80000000!
Your log is corrupted, probably because it exceeded 2.2 GB. You can try to recover the contents, but the easiest thing to do is clear it out completely. From DS Admin, connect to that project and issue "CLEAR.FILE RT_LOG1056". You'll lose the purge settings for the job, but at least the log (and job) will be usable and runnable.
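Before clearing, you can confirm the overflow by checking the sizes of the hashed file's segments on disk. A minimal sketch, assuming the path from the error message (2 GB being the 32-bit hashed-file ceiling per segment):

```shell
# check_log_size: report any DATA.* / OVER.* segment at or over 2 GB.
check_log_size() {
  dir="$1"
  limit=2147483648        # 2 GB, the 32-bit hashed-file limit
  status=ok
  for f in "$dir"/DATA.* "$dir"/OVER.*; do
    [ -f "$f" ] || continue
    size=$(wc -c < "$f")
    if [ "$size" -ge "$limit" ]; then
      echo "OVER LIMIT: $f ($size bytes)"
      status=over
    fi
  done
  echo "$status"
}

# Example, using the path from the error message:
# check_log_size /data/Ascential/DataStage/Projects/DATWHR/RT_LOG1056
```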
Kenneth Bland
Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
You corrupted the log file, probably by extending it past the 2GB limit. At this point about all you can do is clear it by going to the Administrator and issuing a CLEAR.FILE against RT_LOG1056.
Rerun the job with a row limit or set it to abort after a small number of warnings so you can see (and address) the problem.
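From the command line, the same rerun can be done with dsjob. A sketch, assuming the standard -warn and -rows options of dsjob -run; the job name here is a placeholder:

```shell
# Build the dsjob invocation. DATWHR is the project from the error message;
# MyLoadJob is a hypothetical job name. -warn aborts the job after N
# warnings; -rows limits the number of rows processed per link.
PROJECT=DATWHR
JOB=MyLoadJob         # hypothetical job name
WARN_LIMIT=3
ROW_LIMIT=1000
CMD="dsjob -run -warn $WARN_LIMIT -rows $ROW_LIMIT $PROJECT $JOB"
echo "$CMD"
# Run it only on a machine where DataStage is installed:
# $CMD
```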
-craig
"You can never have too many knives" -- Logan Nine Fingers
Strange error while trying to view log entries
Hello there,
You are right, guys, about the log crossing the limit. The job was extracting millions of rows from a table and the data was corrupted, so for every bad row extracted it was writing an entry into the log. I manually stopped the job and cleared the log.
Thanks a lot for your input
Yamini
Guys,
Same issue here. The log file for a job exceeded 2GB, and on finding this, the developer manually removed the files (using the rm command) under the RT_LOGnnn folder. He should have used the CLEAR.FILE command! Now the job is locked. I want to delete this job. I would appreciate it if any of you could guide me.
Thanks!
Nitin Jain | India
If everything seems to be going well, you have obviously overlooked something.
You could take an empty log and copy the files from there over to the directory where the files were removed. That should give you normal access to things again.
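A sketch of that repair, assuming RT_LOG999 is a healthy, freshly cleared log belonging to another job in the same project (both directory names here are placeholders):

```shell
# restore_log: copy the physical files of a healthy, empty log into the
# directory whose files were removed with rm.
restore_log() {
  src="$1"   # e.g. RT_LOG999 (healthy, empty log)
  dst="$2"   # e.g. RT_LOGnnn (files were removed)
  for f in "$src"/DATA.* "$src"/OVER.*; do
    [ -f "$f" ] || continue
    cp "$f" "$dst"/
  done
}

# Example:
# cd /data/Ascential/DataStage/Projects/DATWHR
# restore_log RT_LOG999 RT_LOGnnn
```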
Locked is a whole 'nuther issue. Search the forum for UNLOCK for various discussions on handling locked jobs.
-craig
"You can never have too many knives" -- Logan Nine Fingers
chulett wrote: You corrupted the log file, probably by extending it past the 2GB limit. At this point about all you can do is clear it by going to the Administrator and issuing a CLEAR.FILE against RT_LOG1056.
Do you know how to recover the contents of the log file? I need to investigate the reason for the large number of warnings in the log.
If you cleared it, then it's gone. Re-run the same job with the warning limit set to maybe 2 or 3. That way you can view the log in the Director and see why those warning messages were created. It will also keep the log file size in check.
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
If you really are going to get millions of unavoidable warnings - though I cannot see any good reason for such a situation - you can resize the log (once it has been cleared) to handle more than 2GB.
Code:
RESIZE RT_LOG1056 * * * 64BIT
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.