
Error While Using DB2 Sequence for Surrogate Key

Posted: Wed Feb 27, 2008 10:13 pm
by shawn_2008
Hello,

I am trying to generate surrogate keys using a DB2 sequence, but keep getting an error. The user-defined SQL in the DB2/UDB Enterprise stage is:

INSERT INTO TEST_ACCOUNT_LIST (TSNO, TS, TSNOTE)
VALUES (NEXTVAL FOR ACC_SEQ, ORCHESTRATE.TS, ORCHESTRATE.TSNOTE)

and the error message is:

test_account_list_db2: When preparing operator: When binding partitioner interface: Could not find input field "tsno".


The job reads from a sequential file and, through a transformer, inserts the data into the TEST_ACCOUNT_LIST table. TS and TSNOTE come from the sequential file, whereas TSNO comes from a DB2 sequence (ACC_SEQ). As far as DataStage is concerned, I am a complete newbie :(

Appreciate any Help !
Cheers,
Shawn

Posted: Wed Feb 27, 2008 10:55 pm
by ray.wurlod
If you are using a sequence, you do NOT supply a key value from DataStage. Therefore the only columns passed from DataStage should be the non-key columns - do not have TSNO in the Columns grid.

Posted: Thu Feb 28, 2008 8:38 am
by shawn_2008
Ray,
Thanks for the information. I removed the TSNO column from the grid, but now get a different error message:


main_program: Missing field: TSNO in the input dataset.


The job structure is:

[sequential-file] ---> [transformer]--->db2-table

Sequential-file structure (CSV):
TS, TSNOTE

DB2 table structure:
TSNO
TS
TSNOTE

The column grid in the sequential file, transformer, and DB2 stage is:
ts
tsnote

Cheers,
Shawn

Posted: Thu Feb 28, 2008 8:44 am
by ArndW
In that case, you are using custom SQL that still contains TSNO.

Posted: Thu Feb 28, 2008 9:16 am
by shawn_2008
ArndW,
I am using the following user-defined INSERT statement:

INSERT INTO TEST_ACCOUNT_LIST (TSNO, TS, TSNOTE)
VALUES (NEXTVAL FOR ACC_SEQ, ORCHESTRATE.TS, ORCHESTRATE.TSNOTE)


I know for sure I must be doing something really stupid :x

What I am trying to achieve here is:
1) Read a sequential file (two comma-separated fields: "ts" and "tsnote")
2) Trim the fields (leading/trailing whitespace) in the transformer
3) Insert into a DB2 table (TSNO (BIGINT), TS, and TSNOTE) using the values from the sequential file and a DB2 sequence (the surrogate key for the TSNO column)

Thanks for the help !
Shawn

Posted: Thu Feb 28, 2008 9:37 am
by ArndW
Sorry, I just realized I didn't pay attention to your first post. Are you 100% certain that you aren't using TSNO anywhere else in the stage? Perhaps in custom update code, or before/after SQL? This is the only thing I can think of at the moment.

Posted: Thu Feb 28, 2008 5:48 pm
by shawn_2008
ArndW, Thanks for the response.
No, I checked the job thoroughly and I can confirm that I am not using TSNO anywhere else.

But, Here's an interesting thing :

When I used a DB2/UDB API stage for the table, the job works fine - I was able to insert records based on the sequence.
(All other jobs at our place are using the DB2/UDB Enterprise stage.)

Individually, I can confirm that the sequence is working fine, and since other jobs are using the Enterprise stage, I guess that's working fine too.

The only problem is that the DB2/UDB Enterprise stage is NOT working with the DB2 sequence. I wonder if there is any fundamental difference in the way these two stages handle DB2 sequences.


Cheers,
Shawn

Posted: Thu Feb 28, 2008 6:45 pm
by shawn_2008
Got it to work finally after three cans of Red Bull and a couple of hours of grappling with a slow server...

:D :D :D

I changed the following setting and the job is running like a charm !

In the DB2/UDB Enterprise Stage,
I changed Stage --> Advanced --> Execution Mode to "Sequential"; the default was "Parallel".

I don't have an explanation for why it worked, but I will experiment with a couple of other settings and do some testing (the DB2 "NO ORDER" clause in the sequence, the sequence cache value, etc.).
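For anyone wanting to try those sequence settings, here is a minimal sketch, assuming a DB2 LUW catalog and the ACC_SEQ name from the posts above (adjust the schema to your environment; the CACHE value of 50 is just an illustration):

```sql
-- Inspect the sequence's current cache and ordering settings.
-- ("ORDER" is a reserved word, so it is quoted as a delimited identifier.)
SELECT SEQNAME, CACHE, "ORDER"
  FROM SYSCAT.SEQUENCES
 WHERE SEQNAME = 'ACC_SEQ';

-- Relax ordering and enlarge the cache; on partitioned (DPF) systems this
-- lets each member hand out values without cross-node coordination.
ALTER SEQUENCE ACC_SEQ NO ORDER CACHE 50;
```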

Though it works, it probably negates whatever "parallel" benefit the job might get from running on appropriate hardware.

Cheers,
Shawn

Posted: Fri Feb 29, 2008 12:06 am
by DSDexter

Shawn_2008, I have also faced this issue several times. In our project there were a lot of jobs in which we were generating SKs using a DB2 sequence, and we followed the same approach of changing the execution mode of the stage to Sequential. I am also not fully aware of the reason. If some bright minds :idea: can throw some light on this particular workaround, it would be much appreciated by learners like us. :D

Posted: Fri Feb 29, 2008 2:46 am
by stefanfrost1
The error:

test_account_list_db2: When preparing operator: When binding partitioner interface: Could not find input field "tsno".

comes from the partitioning key defined on your database table. Your table definition probably has TSNO set as the partitioning key, and since you are using a parallel database connection, DataStage tries to send each row directly to the right node. To decide where to send each row, it uses the partitioning key defined on the target table (fetched from the system tables). DataStage can't find that column (TSNO) in the input, so it fails and puts that warning in the log.

The solution is either to set something else as the partitioning key, to generate the surrogate keys in DataStage and send them to the target table, or to use sequential execution mode at delivery, as you all suggested.
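A hedged sketch of how one might verify and act on this, assuming DB2 LUW catalog views (the column types below are guesses, and older DB2 versions spell the clause PARTITIONING KEY (...) rather than DISTRIBUTE BY HASH):

```sql
-- Which column(s) form the table's partitioning/distribution key?
-- PARTKEYSEQ > 0 marks a column as part of the key.
SELECT COLNAME, PARTKEYSEQ
  FROM SYSCAT.COLUMNS
 WHERE TABNAME = 'TEST_ACCOUNT_LIST'
   AND PARTKEYSEQ > 0;

-- One option: distribute on a column DataStage actually supplies, so the
-- parallel stage can compute the target node from the input row.
CREATE TABLE TEST_ACCOUNT_LIST (
  TSNO   BIGINT NOT NULL,
  TS     TIMESTAMP,
  TSNOTE VARCHAR(200)
) DISTRIBUTE BY HASH (TS);
```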

Posted: Fri Feb 29, 2008 11:36 am
by shawn_2008
Stefan, Your explanation makes perfect sense ! Thank You.
It clears up a number of questions that I had about surrogate key handling.