ds_uvput() error with a valid value

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

RodBarnes
Charter Member
Posts: 182
Joined: Fri Mar 18, 2005 2:10 pm

ds_uvput() error with a valid value

Post by RodBarnes »

I know, I know: this topic has been covered several times before. But I've been unable to find a post that matches what I am experiencing. (Please be gentle :wink: )

I am occasionally getting a fatal error like this on an attempt to insert a record into a table:

Code:

ds_uvput() - Write failed for record id '3211'
Where the '3211' is a varying numeric key value being supplied to the link. The error occurs only in this one job, only on this link, and it only happens once during any given run.

I know some of the causes of this error so every time it has happened, I have checked and confirmed:
  * The id for the record has an appropriate value. All of these are numbers of at most 6 digits, it is a single-value key, and it is defined as Integer 10.
  * Using the exact same parameters, I can immediately re-run the job manually and it completes successfully with no errors.
  * There is no disk space issue (> 30 GB free).
  * The table is nowhere near the 4 GB limit (< 100K).
Does anyone have any other suggestions of things to check?
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

From what I recall, other messages get logged along with ones like that, usually a dump of the entire record - can you post that here? It would help give us some idea of the number of columns in the hashed file and what kind of data you are writing to them.

When it's not being run manually, how is it being run?
-craig

"You can never have too many knives" -- Logan Nine Fingers
RodBarnes
Charter Member
Posts: 182
Joined: Fri Mar 18, 2005 2:10 pm

Post by RodBarnes »

The log begins with the usual messages about "Starting job...", "Environment variables...", "Active stage starting...", and "Write caching disabled", then it shows the error message:

Code:

ODS_ExportSystemCastanet..ins_CASTANET_CHECK.insert: ds_uvput() - Write failed for record id '3211'
The very next message is just the usual "Attempting to Cleanup...", followed by the "...aborted" and then "...finished" messages. None of these provide any other specific info; they are just the generic messages.

The table is used nowhere but in this job. It serves to eliminate duplicate entries (sometimes the source produces more than one entry with the same key value), so it has only a single column (the key), defined as Integer 10.

Code:

DICT ODS_CASTANET_CHECK    11:44:11am  30 Oct 2009  Page    1

               Type &
Field......... Field. Field........ Conversion.. Column......... Output Depth &
Name.......... Number Definition... Code........ Heading........ Format Assoc..

@ID            D    0                            ODS_CASTANET_CH 10L    S
                                                 ECK
id             D    0               MD0          id              10R    S
@              PH     id ID.SUP

@KEY           PH     id
The job is part of a sequence that is scheduled to run every hour from a batch file that executes "dsjob -run <project> <sequence>". The error happens only occasionally but when it does, I can immediately turn around, run it with the same parameters (pulled from the log) and it succeeds.
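Since the failure is transient and an immediate rerun with the same parameters succeeds, one pragmatic stopgap at the scheduling layer is to retry the `dsjob -run` invocation once on failure. A minimal sketch (purely illustrative, not part of DataStage; the project and sequence names below are placeholders):

```python
import subprocess
import time

def run_with_retry(cmd, retries=1, delay_secs=30):
    """Run a command; if it fails, wait and retry a limited number of
    times. Illustrates retrying a transient job failure -- this is a
    hypothetical wrapper, not a DataStage facility."""
    for attempt in range(retries + 1):
        result = subprocess.run(cmd)
        if result.returncode == 0:
            return result.returncode
        if attempt < retries:
            time.sleep(delay_secs)  # give the resource conflict time to clear
    return result.returncode

# Example invocation (names are placeholders for the real project/sequence):
# run_with_retry(["dsjob", "-run", "-jobstatus", "MyProject", "MySequence"])
```

This only masks the symptom, of course; the underlying cause still needs to be found.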
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Hmmm... this one will be difficult to track down, I'm afraid. If you Reset the aborted job, do you get any additional 'From previous run...' information in the log?
-craig

"You can never have too many knives" -- Logan Nine Fingers
RodBarnes
Charter Member
Posts: 182
Joined: Fri Mar 18, 2005 2:10 pm

Post by RodBarnes »

Yeah, I figured this wasn't going to be simple. I've tried everything I can think of. I've recreated the table but that made no difference.

The reset doesn't provide any additional info:
    Resetting Job ODS_ExportSystemCastanet.
    Resetting stage ODS_ExportSystemCastanet..xfm.
    ODS_ExportSystemCastanet..ins_CASTANET_CHECK.insert: Write caching disabled
    Finished Resetting stage ODS_ExportSystemCastanet..xfm.
Also, I've confirmed that on none of the occasions where this has happened was the failing key one of the duplicate records; it is always some other, random key value. And it has never happened more than once in a single run. Of course, it is a fatal error that kills the job, so there may have been other instances if the job could have gotten to them.

Though there are other columns being written out to the sequential export file, the only thing going into this table is the id (integer), in order to eliminate subsequent entries with the same id via a lookup against the table. I've used this mechanism in other jobs with no issues at all.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

As have I. And the only time I've had issues, they were easily isolated and repeatable, not something that goes away on a rerun. That seems to imply a resource issue more than anything else, but only in this job? :?

I'm gonna need to ponder this, not sure where to go next. Hopefully someone else will have a thought or two in the meantime.
-craig

"You can never have too many knives" -- Logan Nine Fingers
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Is this table created by a Hashed File stage, a UniVerse stage, a CREATE.FILE statement in TCL, a CREATE TABLE statement in TCL or a mkdbfile command?

Check that there is no non-printing character as part of the key value.
The id column has an inferred data type of Integer, but a non-printing character would render the value invalid as an Integer.
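That check can be done mechanically once the source values can be dumped somewhere inspectable. A quick sketch (Python here purely for illustration; the function names are invented):

```python
def is_clean_integer_key(key):
    """Return True only if the key consists entirely of ASCII digits,
    i.e. contains no non-printing or stray characters that would make
    it invalid as an Integer key."""
    return len(key) > 0 and all("0" <= ch <= "9" for ch in key)

def show_suspect_chars(key):
    """List (position, hex code) for every non-digit character,
    to pinpoint what is contaminating the key."""
    return [(i, hex(ord(ch))) for i, ch in enumerate(key)
            if not "0" <= ch <= "9"]
```

For example, "3211" passes, while "3211" with a trailing carriage return fails and show_suspect_chars reports position 4, character 0xd.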
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
RodBarnes
Charter Member
Posts: 182
Joined: Fri Mar 18, 2005 2:10 pm

Post by RodBarnes »

The table is created by a Hashed File stage as part of the job. I had considered non-printing characters, since they are an obvious candidate, but didn't see how that could be, given that the exact same data works successfully when re-run immediately after the failure.

This job is very simple and does only the following:
  * Runs a SELECT against a SQL Server database.
  * Checks the key value against the key column of the hashed table.
  * If it doesn't exist, writes it to the output sequential file and records it in the hashed table for subsequent checking.
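The steps above amount to the classic dedupe-via-lookup pattern. A minimal in-memory sketch (the set stands in for the hashed CHECK table; names are invented for illustration):

```python
def dedupe_rows(rows, key_field="id"):
    """Emit each row whose key has not been seen before, recording
    every new key -- the in-memory analogue of checking and then
    writing the hashed CHECK table."""
    seen = set()           # stands in for the hashed file of keys
    for row in rows:
        key = row[key_field]
        if key not in seen:
            seen.add(key)  # record the key for subsequent checks
            yield row      # "write the row to the output file"

rows = [{"id": 3211, "v": "a"}, {"id": 17, "v": "b"}, {"id": 3211, "v": "c"}]
deduped = list(dedupe_rows(rows))  # second 3211 row is suppressed
```

The crucial property is that only one writer updates the seen-keys store; the whole pattern assumes that.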
Yet, given Occam's razor, the logical conclusion is that something is wrong with the key value. That would mean the key value would have to change between one run and the next. I'll see what I can come up with to capture this.
RodBarnes
Charter Member
Posts: 182
Joined: Fri Mar 18, 2005 2:10 pm

Post by RodBarnes »

Looking through each stage of the sequence, I found there was actually another job inserting into this same hashed table. This was a cut-and-paste error: the second job had been created from the first, and I neglected to create a separate check table. Since both jobs were attempting to update the same hashed table (one with all the existing IDs, one with only the new IDs), they were stepping on each other: one would insert a new record with key 1234 while the other attempted to insert the same key and found the record locked.
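That failure mode, two writers hitting the same store with the second one finding a record locked, is easy to reproduce outside DataStage. A sketch using SQLite as an analogy (the hashed file's locking belongs to UniVerse, not SQLite; this just shows the class of error):

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "check_table.db")

# Writer 1 creates the "check table" and starts writing a key,
# holding its transaction open (as one job would mid-run).
writer1 = sqlite3.connect(path)
writer1.execute("CREATE TABLE check_keys (id INTEGER PRIMARY KEY)")
writer1.commit()
writer1.execute("INSERT INTO check_keys VALUES (1234)")  # transaction stays open

# Writer 2 (timeout=0: fail immediately rather than wait) now tries
# to write the same table and hits the lock -- the analogue of the
# ds_uvput() write failure.
writer2 = sqlite3.connect(path, timeout=0)
try:
    writer2.execute("INSERT INTO check_keys VALUES (1234)")
    outcome = "ok"
except sqlite3.OperationalError:
    outcome = "locked"

writer1.rollback()
```

Giving each job its own check table removes the shared writer, which is exactly the fix described above.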

I have high confidence that this is resolved, so I am marking it as such.
Last edited by RodBarnes on Mon Nov 02, 2009 1:33 pm, edited 1 time in total.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

8)
-craig

"You can never have too many knives" -- Logan Nine Fingers