rows mismatch while loading into db2 table using odbc stage
Moderators: chulett, rschirm, roy
Hi,
My source is DB2 UDB and the target is also DB2 UDB.
The job extracts data using a DB2 UDB API stage and loads into the target using an ODBC stage.
The job also captures rejected and filtered rows using constraints written in the transformer.
The issue is that some rows are missed while loading into the target, even though the performance statistics show a correct result. For example:
extracted rows = 200
rejected rows due to data issues = 50
loaded into target = 100 (the issue)
filtered rows = 0 (no filter condition, so that is fine)
So 50 rows are missing, yet the performance statistics for the target link show 150 rows.
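(Side note for readers: the link counts above should reconcile exactly; a minimal Python sketch of that sanity check, using the numbers quoted in this post, not DataStage code.)

```python
def reconcile(extracted, rejected, filtered, loaded):
    """Return how many rows are unaccounted for across the job's links."""
    return extracted - rejected - filtered - loaded

# Numbers from the post: 50 rows cannot be explained by rejects or filters.
missing = reconcile(extracted=200, rejected=50, filtered=0, loaded=100)
```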
The job finished successfully, but the logs show some phantom-related warnings.
Any help will be highly appreciated.
thanks
Rafiq
Hi DSguru2B and Ray,
Thanks for the reply.
I have reset and rerun the job many times, and I get the same issue.
There is no error, only a warning message.
The log also shows correct row counts:
per the log, 168667 rows were loaded into the target table, but a SELECT COUNT against the table returns 168615 rows.
52 rows are mismatched.
I don't understand what the issue is.
Please help.
The log looks like the following (partial):
POD_Elc_Load_Acccode_Dtls..TRN_Apply_Business_Logic.LNK_Write_Acccode_Dtls: DSD.BCIPut call to OCONV failed.
Column: ACCESS_CODE, NLS mapname: MS1252, Column data (after conversion): ???, Column data (in hex, before conversion) FFFFFF
POD_Elc_Load_Acccode_Dtls..TRN_Apply_Business_Logic: At row 1, link 'LNK_Write_Rejects', while processing column 'ELC_ACC_CDE', value truncated to '???'.
POD_Elc_Load_Acccode_Dtls..TRN_Apply_Business_Logic: DSD.StageRun Active stage finishing.
168669 rows read from LNK_Read_Elc_Cnect
168669 rows read from LNK_Read_Org_Code
168667 rows written to LNK_Write_Acccode_Dtls
2 rows written to LNK_Write_Rejects
0 rows written to LNK_Write_Filtered
18.406 CPU seconds used, 26.110 seconds elapsed.
DataStage Job 197 Phantom 5468
Program "DSD.BCIPut": Line 507, Variable previously undefined. Zero length string used.
Program "DSD.BCIPut": Line 392, Variable "ROWS.WRITTEN" previously undefined. Zero used.
Program "DSD.BCIPut": Line 507, Variable previously undefined. Zero length string used.
Program "DSD.BCIPut": Line 392, Variable "ROWS.WRITTEN" previously undefined. Zero used.
DataStage Phantom Finished
Rafi, it seems you are doing some transformations on the ACCESS_CODE and ELC_ACC_CDE columns, and the data in those fields is either not in the proper format or contains some weird data that your target stage is not recognising. What transformations are you doing?
I haven't failed, I've found 10,000 ways that don't work.
Thomas Alva Edison(1847-1931)
Aslam,
us1aslam1us wrote: Rafi, it seems you are doing some transformations on the ACCESS_CODE and ELC_ACC_CDE columns, and the data in those fields is either not in the proper format or contains some weird data that your target stage is not recognising. What transformations are you doing?
Thanks for the response!
In the transformer I am only checking for null, that's it.
Due to some funny characters in the source column ELC_ACC_CDE, two rows are rejected after those characters are truncated.
if isnull(LNK_Read_Elc_Cnect.ELC_ACC_CDE) then " " else LNK_Read_Elc_Cnect.ELC_ACC_CDE
But I don't understand where the remaining 52 rows go.
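(For readers unfamiliar with DataStage BASIC, the transformer expression above is roughly equivalent to the following; a minimal Python sketch, not the job's actual code, and the function name is made up for illustration.)

```python
def coalesce_acc_cde(value):
    """Equivalent of: if isnull(x) then " " else x
    Substitutes a single space when the source column is NULL,
    so the not-nullable target column always receives a value."""
    return " " if value is None else value
```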
OK. Tell us the datatype for this field in both source and target, including the lengths and whether each is nullable or not nullable. For testing purposes, try replacing the target stage with a Sequential File stage and check what happens.
I haven't failed, I've found 10,000 ways that don't work.
Thomas Alva Edison(1847-1931)
Aslam,
us1aslam1us wrote: OK. Tell us the datatype for this field in both source and target, including the lengths and whether each is nullable or not nullable. For testing purposes, try replacing the target stage with a Sequential File stage and check what happens.
Thanks for the response.
Both the source and target fields are varchar.
The source field is nullable and the target field is not nullable.
In the transformer I check for nullability: if it is null, I populate " ", else the source field.
My guess is that there is some data issue.
Any other ideas? Please help.
Regards,
Rafiq
Thanks, everybody, for the help!
The issue is sorted out.
I just changed the NLS map from MS1252 to UTF8, and now it's working fine.
Due to some junk characters in the source, some of the rows were being truncated; using UTF8 sorted this out.
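(A rough illustration of why an unmapped byte gets mangled during character-set conversion, similar to the '???' in the log. This is a plain Python analogue with made-up byte values, not DataStage's NLS engine; 0x81 happens to be an undefined code point in Windows-1252.)

```python
raw = b"ACC\x81"  # hypothetical column bytes; 0x81 has no mapping in Windows-1252

try:
    text = raw.decode("cp1252")  # strict conversion fails on the unmapped byte
except UnicodeDecodeError:
    # A lenient conversion keeps the readable prefix and substitutes
    # U+FFFD for the junk byte, much like the '???' seen in the job log.
    text = raw.decode("cp1252", errors="replace")
```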
Thanks
Rafik