Error viewing log, Computed blink does not match expected

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

yaminids
Premium Member
Posts: 387
Joined: Mon Oct 18, 2004 1:04 pm

Error viewing log, Computed blink does not match expected

Post by yaminids »

Hello friends,

I am getting the following error when I try to view the log of an aborted job. Can someone please help me resolve the issue?

Thanks a lot in advance
Yamini

Error:
Error selecting from log file RT_LOG1056
Command was: SSELECT RT_LOG1056 WITH @ID LIKE '1N0N' AND TIMESTAMP >= "2004-01-01 00:30:00" COUNT.SUP
Error was: Internal data error. File '/data/Ascential/DataStage/Projects/DATWHR/RT_LOG1056/DATA.30': Computed blink of 0x8EC does not match expected blink of 0x0! Detected within group starting at address 0x80000000!
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL

Post by kcbland »

Your log is corrupted, probably because it exceeded 2.2GB. You can try to recover the contents, but the easiest thing to do is clear it out completely. From DS Admin, connect to that project and issue "CLEAR.FILE RT_LOG1056". You'll lose the purge settings for the job, but at least the log (and the job) will be usable and runnable.
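For reference, this is roughly what it looks like typed into the Administrator's Command window for that project (the file name comes straight from the error message above; substitute your own job's log number):

Code: Select all

CLEAR.FILE RT_LOG1056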
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

You corrupted the log file, probably by extending it past the 2GB limit. At this point about all you can do is clear it by going to the Administrator and issuing a CLEAR.FILE against RT_LOG1056.

Rerun the job with a row limit or set it to abort after a small number of warnings so you can see (and address) the problem.
-craig

"You can never have too many knives" -- Logan Nine Fingers
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

You should purge your log more frequently. Set auto purge. Don't let it grow past 2GB. If it's going to anyway, resize it to use 64-bit pointers.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

I'm guessing it's one of those single runs with unlimited warnings kind of thing...
-craig

"You can never have too many knives" -- Logan Nine Fingers
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

2GB is a shitload (that's a technical term) of warning messages!!!
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
yaminids
Premium Member
Posts: 387
Joined: Mon Oct 18, 2004 1:04 pm

Strange error while trying to view log entries

Post by yaminids »

Hello there,

You guys were right about the log crossing the limit. The job was extracting millions of rows from a table and the data was corrupted, so for every bad row extracted it was writing an entry to the log. I manually stopped the job and cleared the log.

Thanks a lot for your input
Yamini
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

One of the biggest reasons not to run jobs with the "Abort after X warnings" option set to Unlimited. :wink:
-craig

"You can never have too many knives" -- Logan Nine Fingers
koolnitz
Participant
Posts: 138
Joined: Wed Sep 07, 2005 5:39 am

Post by koolnitz »

Guys,

Same issue here. The log file for a job exceeded 2GB and, on finding this, the developer manually removed the files (using the rm command) under the RT_LOGnnn folder. He should have used the CLEAR.FILE command!! Now the job is locked. I want to delete this job and would appreciate it if any of you could guide me.

Thanks!
Nitin Jain | India

If everything seems to be going well, you have obviously overlooked something.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

You could take an empty log and copy the files from there over to the directory where the files were removed. That should give you normal access to things again.
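On UNIX that amounts to something like the sketch below. The project path is just the one from the error message earlier in this thread, RT_LOG999 stands for any healthy, empty log you copy from, RT_LOGnnn is the one whose files were removed, and DATA.30 / OVER.30 are the two pieces that make up a dynamic hashed file:

Code: Select all

# copy the physical pieces of an empty log over the one that was wiped out
cd /data/Ascential/DataStage/Projects/DATWHR
cp RT_LOG999/DATA.30 RT_LOGnnn/
cp RT_LOG999/OVER.30 RT_LOGnnn/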

Locked is a whole 'nuther issue. Search the forum for UNLOCK for various discussions on handling locked jobs.
-craig

"You can never have too many knives" -- Logan Nine Fingers
koolnitz
Participant
Posts: 138
Joined: Wed Sep 07, 2005 5:39 am

Post by koolnitz »

Bingo.. It worked :D

Thanks Craig!
Nitin Jain | India

If everything seems to be going well, you have obviously overlooked something.
dpegasus
Participant
Posts: 5
Joined: Thu Jan 12, 2006 4:16 am

Post by dpegasus »

chulett wrote:You corrupted the log file, probably by extending it past the 2GB limit. At this point about all you can do is clear it by going to the Administrator and issuing a CLEAR.FILE against RT_LOG1056.

Re ...
Do you know how to recover the contents of the log file? I need to investigate the reason for the large number of warnings in it.
kumar_s
Charter Member
Posts: 5245
Joined: Thu Jun 16, 2005 11:00 pm

Post by kumar_s »

If you haven't cleared the file, LIST filename at TCL or in the Administrator client should dump the contents to the console, and you can work from that if you can make sense of it.
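Something along these lines, using the log from the original post as the example. The first command lists everything; the second narrows it down by date in the same way the SSELECT in the original error does (the date is just an example):

Code: Select all

LIST RT_LOG1056
LIST RT_LOG1056 WITH TIMESTAMP >= "2006-01-01 00:00:00"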
Impossible doesn't mean 'it is not possible' actually means... 'NOBODY HAS DONE IT SO FAR'
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

If you cleared it, then it's gone. Re-run the same job with the warning limit set to maybe 2 or 3. That way you can view the log in the Director and see why those warning messages were created. It will also keep the log file size in check.
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

If you really are going to get millions of unavoidable warnings - though I cannot see any good reason for such a situation - you can resize the log (once it has been cleared) to handle more than 2GB.

Code: Select all

RESIZE RT_LOG1056 * * * 64BIT
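If memory serves, the three asterisks keep the file's existing type, modulus and separation, so the only thing that changes is the switch to 64-bit addressing.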
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.