Problems with Metadata. Column not found.

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

bikan
Premium Member
Posts: 128
Joined: Thu Jun 08, 2006 5:27 am

Problems with Metadata. Column not found.

Post by bikan »

Hi,

I guess it's one more day of DataStage acting weird.

I have a simple job:



Seq. File ---> Sparse Lookup ---> Copy ---(2 output links)---> two DB2 Upserts

and the fatal error it gives is:

db2Upsert: When preparing operator: When binding partitioner interface: Could not find input field "arr_id_app".

Worst of all, it's not printing the OSH schemas even when that option is turned on. I also tried APT_DISABLE_COMBINATION = True, and that isn't working correctly either.
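For reference, schema printing and operator combination are usually controlled through environment variables; a minimal sketch follows (these are standard PX variable names, but check availability on your DataStage version):

```shell
# Standard PX environment variables, set as job parameters or in the
# project environment; exact behaviour varies by DataStage version.
export OSH_PRINT_SCHEMAS=1        # print operator interface schemas in the job log
export APT_DUMP_SCORE=1           # dump the job score: operators, partitioners, combining
export APT_DISABLE_COMBINATION=1  # run each operator uncombined
```

APT_DUMP_SCORE is often the most useful of the three here, since the score shows which operators were combined and where each partitioner was bound.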

Please don't just say to check for the column in my metadata. The column is present and I can see it. If I cut off the rest of the job after the Copy stage and replace the Copy with a Peek, it works fine.

FYI, the column in question is fetched by the sparse lookup and is Decimal(11,0).
bcarlson
Premium Member
Posts: 772
Joined: Fri Oct 01, 2004 3:06 pm
Location: Minnesota

Post by bcarlson »

So you know the field is present in the input stream, with the same case-sensitive name and spelling?

Try removing the DB2 upserts and dumping out to datasets instead. Then look at the datasets and verify that the column names, spelling, and case are exactly what you anticipated.
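A quick way to trace this, assuming you can save the generated OSH to a text file (`myjob.osh` below is a hypothetical filename), is to grep it for the column and see which operator schemas still carry it:

```shell
# myjob.osh is a hypothetical file holding the generated OSH
# (copied out of Job Properties -> Generated OSH in Designer).
# Every operator schema that still carries the column will match;
# the first operator whose schema lacks it is where it 'fell off'.
grep -n 'arr_id_app' myjob.osh || echo "column not found in OSH"
```

You can also point `orchadmin describe` at the intermediate dataset to see its record schema directly (again, check the exact options on your version).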

The fact of the matter is that it is a metadata issue. Work backwards from the failed stage and identify where the column 'fell off' or got renamed.

Brad.
bikan
Premium Member
Posts: 128
Joined: Thu Jun 08, 2006 5:27 am

Post by bikan »

Hi Brad,

Everything is fine with the job now, but I still can't understand why DataStage shows such an error, or whether this is just its default behaviour.

But I was able to find a workaround. I guess it has more to do with Orchestrate interface optimisation when using the parallel enterprise database stages. In the job I described above, two output links come out of the Copy stage, and only one of them carried the column "arr_id_app", as per my business requirement. But the error did not disappear until I added that column to the other output link as well, which according to my business logic is unnecessary, since that link's upsert query does not use the column.

Worst of all, my data is not key-partitioned on that column, and I have various other columns that are likewise not common to the two output links. Yet the error occurs only for this one column. The only thing special about it is that the update query in one of the links uses this field.

Is it that DataStage builds some kind of index on the columns used in the WHERE clause of the upsert query, and is therefore mandating that those columns be present on both links so that it can construct a composite index?