rows mismatch while loading into db2 table using odbc stage

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

Post Reply
rafik2k
Participant
Posts: 182
Joined: Wed Nov 23, 2005 1:36 am
Location: Sydney

rows mismatch while loading into db2 table using odbc stage

Post by rafik2k »

Hi,
My source is DB2 UDB and the target is also DB2 UDB.
My job extracts data using a DB2 UDB API stage and loads into the target using an ODBC stage.

This job also captures rejected and filtered rows using constraints written in the transformer.

The issue is that while loading into the target some of the rows go missing,
even though the performance statistics show the correct result.

e.g. extracted rows = 200
rejected rows due to data issues = 50
but rows loaded into the target = 100 (the issue)
filtered rows = 0 (no filter condition, fine)

So 50 rows are missing, yet the performance statistics for the target link show 150 rows.

The job finished successfully, but the logs show some phantom-related warnings.
Any help will be highly appreciated.

thanks
Rafiq
us1aslam1us
Charter Member
Posts: 822
Joined: Sat Sep 17, 2005 5:25 pm
Location: USA

Post by us1aslam1us »

By any chance, are you trying to load duplicate key values into any unique-constraint fields? Can you post the warning messages here?
I haven't failed, I've found 10,000 ways that don't work.
Thomas Alva Edison(1847-1931)
rafik2k
Participant
Posts: 182
Joined: Wed Nov 23, 2005 1:36 am
Location: Sydney

Post by rafik2k »

I am selecting distinct rows from the source itself,
and there is no warning regarding a unique constraint in the logs.
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

Can you post the exact error message? Also reset the job and see if any additional messages pop up in the log file, specifically "From previous run.."
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

The link row count shows the number of rows sent, not necessarily the number of records successfully loaded.

That's why we have reject handling.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
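Ray's distinction can be sketched in miniature: the count of rows *sent* to a database is not the count of rows it *accepted*, and the difference is exactly what a reject link should capture. Here is a hypothetical sketch using Python's sqlite3 (the table and column names are invented for illustration; DataStage handles this through the stage's reject link, not hand-written code):

```python
import sqlite3

# Rows sent down a link vs. rows actually stored: a NOT NULL constraint
# silently shrinks the stored count unless failed rows are captured.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE acc_dtls (elc_acc_cde TEXT NOT NULL)")

rows_sent = ["A1", None, "B2"]   # None violates the NOT NULL constraint
rejects = []
for row in rows_sent:
    try:
        conn.execute("INSERT INTO acc_dtls VALUES (?)", (row,))
    except sqlite3.IntegrityError:
        rejects.append(row)      # the "reject link": failed rows land here
conn.commit()

loaded = conn.execute("SELECT COUNT(*) FROM acc_dtls").fetchone()[0]
print(len(rows_sent), loaded, len(rejects))  # 3 sent, 2 loaded, 1 rejected
```

The link statistic corresponds to `len(rows_sent)`; only a count against the table itself (or a captured reject stream) reveals the shortfall.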
rafik2k
Participant
Posts: 182
Joined: Wed Nov 23, 2005 1:36 am
Location: Sydney

Post by rafik2k »

Hi DSguru2B and Ray,
Thanks for reply.
I have reset and rerun the job many times, and I get the same issue.
There is no error, only warning messages.

The log also shows the row counts.
Per the log, 168667 rows were loaded into the target table, but a select count(*) against the table returns 168615 rows.

That is a mismatch of 52 rows.
I don't understand what the issue is.

Please help.

The log looks like the following (in part):

POD_Elc_Load_Acccode_Dtls..TRN_Apply_Business_Logic.LNK_Write_Acccode_Dtls: DSD.BCIPut call to OCONV failed.
Column: ACCESS_CODE, NLS mapname: MS1252, Column data (after conversion): ???, Column data (in hex, before conversion) FFFFFF

POD_Elc_Load_Acccode_Dtls..TRN_Apply_Business_Logic: At row 1, link 'LNK_Write_Rejects', while processing column 'ELC_ACC_CDE', value truncated to '???'.

POD_Elc_Load_Acccode_Dtls..TRN_Apply_Business_Logic: DSD.StageRun Active stage finishing.
168669 rows read from LNK_Read_Elc_Cnect
168669 rows read from LNK_Read_Org_Code
168667 rows written to LNK_Write_Acccode_Dtls
2 rows written to LNK_Write_Rejects
0 rows written to LNK_Write_Filtered
18.406 CPU seconds used, 26.110 seconds elapsed.

DataStage Job 197 Phantom 5468
Program "DSD.BCIPut": Line 507, Variable previously undefined. Zero length string used.
Program "DSD.BCIPut": Line 392, Variable "ROWS.WRITTEN" previously undefined. Zero used.
Program "DSD.BCIPut": Line 507, Variable previously undefined. Zero length string used.
Program "DSD.BCIPut": Line 392, Variable "ROWS.WRITTEN" previously undefined. Zero used.
DataStage Phantom Finished
us1aslam1us
Charter Member
Posts: 822
Joined: Sat Sep 17, 2005 5:25 pm
Location: USA

Post by us1aslam1us »

Rafi, it seems you are doing some transformations on the ACCESS_CODE and ELC_ACC_CDE columns, and the data in those fields is either not in the proper format or contains some weird characters that your target stage is not recognising. What transformations are you doing?
I haven't failed, I've found 10,000 ways that don't work.
Thomas Alva Edison(1847-1931)
rafik2k
Participant
Posts: 182
Joined: Wed Nov 23, 2005 1:36 am
Location: Sydney

Post by rafik2k »

us1aslam1us wrote: Rafi, it seems you are doing some transformations on the ACCESS_CODE and ELC_ACC_CDE columns, and the data in those fields is either not in the proper format or contains some weird characters that your target stage is not recognising. What transformations are you doing?
Aslam,
Thanks for response!
In the transformer I am only checking for null, that's it.
Due to some funny characters in the source column ELC_ACC_CDE, two rows are rejected after those characters are truncated.

if isnull(LNK_Read_Elc_Cnect.ELC_ACC_CDE) then " " else LNK_Read_Elc_Cnect.ELC_ACC_CDE

But I don't understand where the remaining 52 rows went.
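The derivation above is a plain null-coalescing check. As a minimal Python sketch of the equivalent logic (the function name is hypothetical): a NULL source value becomes a single space so the NOT NULL target column always receives a value.

```python
# Python equivalent of the DataStage derivation:
#   if isnull(LNK_Read_Elc_Cnect.ELC_ACC_CDE) then " " else LNK_Read_Elc_Cnect.ELC_ACC_CDE
def derive_elc_acc_cde(value):
    return " " if value is None else value

print(repr(derive_elc_acc_cde(None)))  # ' '
print(repr(derive_elc_acc_cde("X9")))  # 'X9'
```

Note this only handles NULLs; it does nothing about characters the target's NLS map cannot represent, which is a separate failure mode.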
us1aslam1us
Charter Member
Posts: 822
Joined: Sat Sep 17, 2005 5:25 pm
Location: USA

Post by us1aslam1us »

OK. Tell us the datatype of this field in both source and target, including the lengths and whether each is nullable or not nullable. For testing purposes, try replacing the target stage with a Sequential File stage and check what happens.
I haven't failed, I've found 10,000 ways that don't work.
Thomas Alva Edison(1847-1931)
rafik2k
Participant
Posts: 182
Joined: Wed Nov 23, 2005 1:36 am
Location: Sydney

Post by rafik2k »

us1aslam1us wrote: OK. Tell us the datatype of this field in both source and target, including the lengths and whether each is nullable or not nullable. For testing purposes, try replacing the target stage with a Sequential File stage and check what happens.
Aslam,
Thanks for response
Both the source and target fields are varchar.
The source field is nullable and the target field is not nullable.
In the transformer I check for nullability: if it is null then I populate " ", else the source field.

My guess is that there is some data issue.

Any other ideas? Please help.

Regards,
Rafiq
rafik2k
Participant
Posts: 182
Joined: Wed Nov 23, 2005 1:36 am
Location: Sydney

Post by rafik2k »

rafik2k wrote:
us1aslam1us wrote: OK. Tell us the datatype of this field in both source and target, including the lengths and whether each is nullable or not nullable. For testing purposes, try replacing the target stage with a Sequential File stage and check what happens.
Aslam,
Thanks for the response.
Both the source and target fields are varchar.
The source field is nullable and the target field is not nullable.
In the transformer I check for nullability: if it is null then I populate " ", else the source field.

My guess is that there is some data issue.

Any other ideas? Please help.
Regards,
Rafiq
Thanks everybody for the help!
The issue is sorted out.
I just changed the NLS map from MS1252 to UTF8.
Now it's working fine.

Due to some junk characters in the source, some of the rows were being truncated.
Using UTF8, this has been sorted out.

Thanks
Rafik
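The resolution above can be illustrated in a few lines: characters outside the CP1252 (MS1252) code page cannot be represented under that NLS map and get replaced, which is the "???" truncation in the log, while UTF-8 can encode any Unicode character. A small sketch (U+2192 is just a stand-in for whatever junk character was in the real data):

```python
# A character with no CP1252 mapping is lost under an MS1252 NLS map
# but round-trips cleanly under UTF-8.
sample = "AB\u2192CD"  # U+2192 (rightwards arrow) stands in for the junk character

cp1252_bytes = sample.encode("cp1252", errors="replace")
utf8_bytes = sample.encode("utf-8")

print(cp1252_bytes)  # b'AB?CD'  -- data replaced/lost under MS1252
print(utf8_bytes)    # b'AB\xe2\x86\x92CD'  -- preserved under UTF-8
```

This is why switching the job's NLS map to UTF8 stopped the truncation: the target simply never sees an unmappable character.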
Post Reply