DB2 Connector stage using user defined query.

Post by sreddy »

Hi

I am using a DB2 Connector stage as the source and a Sequential File stage as the target. My source database is DB2 UDB 9.7. I wrote a user-defined query using the RANK function, capturing the result of the RANK in an alias column, and I use that alias column in my select column list.

My problem is that when I try to read the data from the DB2 database through the DB2 Connector stage, it raises the errors below and the job aborts.

DB2_Connector_114: Schema reconciliation detected a size mismatch for column Name_ID. When reading database column INT64 into column INT32, truncation, loss of precision or data corruption can occur. (CC_DBSchemaRules::reportSizeMismatch, file CC_DBSchemaRules.cpp, line 1,692)
main_program: Could not check all operators because of previous error(s)
Error when checking operator: Could not find input field "Name_ID".
DB2_Connector_114: When checking operator: The modify operator has a binding for the non-existent output field "Name_ID".
main_program: Creation of a step finished with status = FAILED.

DB2_Connector_114: When checking operator: The modify operator keeps field "Name_ID" which is not present in the output interface.
============================
SELECT
    Name.Name_ID AS Name_ID
FROM STG_TTP AS TP,
     (SELECT LAST_NAME, FIRST_NAME, MIDDLE, SUFFIX, DOB,
             RANK() OVER (ORDER BY LAST_NAME, FIRST_NAME, MIDDLE, SUFFIX, DOB DESC) AS Name_ID
      FROM STG_TP) AS Name

Post by ArndW »

Have you tried declaring NAME_ID as a bigint?
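
For example (a sketch only; the column and table names come from the posted query, everything else is an assumption): either change Name_ID to BigInt on the connector's Columns tab, or cast the RANK result in the SQL so it fits the Integer (INT32) column already defined in the job:

-- RANK() returns a BIGINT in DB2; casting it down to INTEGER makes it
-- match an INT32 column on the Columns tab (safe only while the row
-- count stays inside the INTEGER range)
SELECT CAST(RANK() OVER (ORDER BY LAST_NAME, FIRST_NAME, MIDDLE, SUFFIX, DOB DESC)
            AS INTEGER) AS Name_ID
FROM STG_TP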

Post by sreddy »

Arnd,

Initially I tried varchar, and later integer; I am still getting the same error.

This is the first time I am using the value returned by the RANK function as part of a user-defined query.

Could you correct me if I have made any mistake in the query?

Post by ArndW »

Try BigInt, as suggested in my first post.

Post by Kryt0n »

As ArndW said, make sure you use BigInt to start with; sometimes DataStage throws out superfluous errors triggered by an earlier one, and fixing that one ends up fixing them all.

On a side note, is that your full query? You aren't joining your two tables, so you are going to get a Cartesian product. Run the query in an SQL editor first and confirm you are getting what you want.
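
Something along these lines, purely as a sketch (the join columns are an assumption; only you know what actually relates STG_TTP to the derived Name set):

SELECT Name.Name_ID AS Name_ID
FROM STG_TTP AS TP
JOIN (SELECT LAST_NAME, FIRST_NAME, MIDDLE, SUFFIX, DOB,
             RANK() OVER (ORDER BY LAST_NAME, FIRST_NAME, MIDDLE, SUFFIX, DOB DESC) AS Name_ID
      FROM STG_TP) AS Name
  ON  TP.LAST_NAME  = Name.LAST_NAME     -- assumed join keys: replace with the
  AND TP.FIRST_NAME = Name.FIRST_NAME    -- columns that really link the two tables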

Post by chandra.shekhar@tcs.com »

Hello sreddy,

In the DB2 Connector stage, set the properties below to "No":

Schema Reconciliation
   --> Fail on size mismatch: No
   --> Fail on type mismatch: No

Compile the job and rerun.

Kindly share the feedback.

Thanks.
Thanx and Regards,
ETL User

Post by ArndW »

What Chandra has suggested is a way to avoid seeing the error. It is much better to fix the problem itself and avoid the overhead of an unnecessary cast/conversion. There is a reason the default for this setting is "OFF", and this is a good example of why: the setting should only be turned "ON" when the developer knows the cause, understands what the error implies, and has no other way of avoiding the message.

Post by chulett »

+1

:wink:
-craig

"You can never have too many knives" -- Logan Nine Fingers

Post by ArndW »

I couldn't think of an example yesterday, but I have a good one today. In the database I have columns defined as the SQL Server type "timestamp2" with 7 digits of fractional-second precision, but DataStage only allows 6 digits. With schema reconciliation left at its default setting, my job aborts. I can't change the database, nor can I explicitly cast the column in the SQL (since we are using generic jobs), so I have to set "Fail on size mismatch" to "No" (though I leave the type mismatch setting at "Yes"). I still get an ugly warning, but I can demote it to an informational message with a job-level message handler.