Loss of Records and No Errors Generated
Moderators: chulett, rschirm, roy
Hi All,
My project has a job that loads data into an AS400 table. This is an existing job, and there were problems due to bugs in previous jobs, but the record counts always used to match. We fixed the bugs in the preceding jobs and, lo and behold, I load 907 records according to DataStage (not a single warning) and the end users get only 863 records. This has been happening ever since. It's a simple job: OCI to Transformer to ODBC (AS400).
Please advise.
-
- Participant
- Posts: 597
- Joined: Fri Apr 29, 2005 6:19 am
- Location: Singapore
You don't get any warnings...? So have you monitored the job and looked at how many records are coming from the source and being loaded to the target?
End users? Do they see the data through Cognos, BO, or some other OLAP tool? Are they using pre-canned reports or ad hoc queries against the DW table? If you give us a little more information, it will help.
Kandy
_________________
Try and Try again…You will succeed atlast!!
kandyshandy wrote: You don't get any warnings...? So have you monitored the job and looked how many records are coming from source and loaded to target?

I monitored the job through the Director and the scheduling tool. There were absolutely no warnings.
The ODBC settings are set as follows:
Update Action: User-defined SQL
SQL is: Insert into Table Name(Col1, Col2......Coln) Values (?,?,?....?) With None;
I find that the records reported as missing still exist, but with old timestamps. I don't yet know what sort of interface the end users use to query the AS400, but they are able to use SQL. My concern, however, is this: can DataStage drop records without generating any warning? Where can I look to see whether records are being rejected or dropped?
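Since the job has no reject link, one way to pin down exactly which rows went missing is a key-level reconciliation outside DataStage: dump the source keys (e.g. to a flat file, as done later in this thread) and compare them against the keys actually present in the target table. A minimal, hypothetical sketch in plain Python, with made-up key values standing in for the real data:

```python
# Hypothetical reconciliation: any key that was extracted from source but
# never landed in the target was silently dropped somewhere along the way.

def find_missing(source_keys, target_keys):
    """Return, sorted, the keys extracted from source but absent in target."""
    return sorted(set(source_keys) - set(target_keys))

# Illustrative counts only -- not the real keys from this job.
source_keys = ["K%03d" % i for i in range(907)]   # 907 rows per DataStage
target_keys = ["K%03d" % i for i in range(863)]   # 863 rows seen by end users

missing = find_missing(source_keys, target_keys)
print(len(missing))
```

The list of missing keys can then be fed back into a one-off SQL query on the AS400 side to see whether those rows exist with old timestamps or are absent entirely.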
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
Why assume that it is DataStage? DataStage reports the number of records it sends to the ODBC driver. It may be the ODBC driver, it may be the interface into the AS/400 environment, it may be in the tables themselves where the problem occurs.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
-
- Participant
- Posts: 597
- Joined: Fri Apr 29, 2005 6:19 am
- Location: Singapore
rachit82 wrote: I monitored the Job through the Director and the Scheduling tool. There were absolutely no warnings.

When I said monitor, I meant check the number of records extracted from the source and loaded to the target. Right-click the job in Director and select Monitor; then you can see the details.
If there is a difference between the source count and the target count, check the SELECT SQL in the source stage. You may have a WHERE clause that ignores records with old timestamps.
Kandy
_________________
Try and Try again…You will succeed atlast!!
kandyshandy wrote: When I said monitor, I meant check the number of records extracted from the source and loaded to the target. Right-click the job in Director and select Monitor; then you can see the details.

This was the next step I checked, and I even loaded the data into a flat file to analyze it. I did a one-time load for just the missing 41 records, and even then those records were not updated through ODBC at all. DataStage showed that it had successfully loaded all records.
chulett wrote: With none what?

I assume this was system-generated SQL that was modified when User-Defined SQL was selected. It was always there, totally unchanged. As I keep stating, I did not touch the job in Production. However, this job has started a whole chain of problems for reasons unknown.
ray.wurlod wrote: Why assume that it is DataStage? DataStage reports the number of records it sends to the ODBC driver. It may be the ODBC driver, it may be the interface into the AS/400 environment, it may be in the tables themselves where the problem occurs.

What I would like to know is how I can test the ODBC drivers, or how I can trace the logs of these loads on the AS400 to see what happened.
-
- Participant
- Posts: 597
- Joined: Fri Apr 29, 2005 6:19 am
- Location: Singapore
rachit82 wrote: This was the next step I checked, and I even loaded the data into a flat file to analyze it. I did a one-time load for the missing 41 records, and even then those records were not updated through ODBC at all. DataStage showed that it had successfully loaded all records.

When you say 'missing 41 records', do you mean that they are not inserted, but DS shows that they are inserted?
Kandy
_________________
Try and Try again…You will succeed atlast!!
rachit82 wrote: I assume this was system-generated SQL that was modified when User-Defined SQL was selected. It was always there, totally unchanged.

So you don't know what that "with none" clause means? I'm wondering if it is masking whatever problem you may (or may not) be having, if the "none" is related to error reporting. I'm also curious whether this table is journaled; that always seems to come up in googled discussions of DB2 SQL whenever this "none" option is mentioned.
Last edited by chulett on Sat Jan 17, 2009 9:21 pm, edited 1 time in total.
-craig
"You can never have too many knives" -- Logan Nine Fingers
rachit82 wrote: This was the next step I checked, and I even loaded the data into a flat file to analyze it. I did a one-time load for the missing 41 records, and even then those records were not updated through ODBC at all. DataStage showed that it had successfully loaded all records.

I'm concerned that you are mentioning records not being "updated" when you are allegedly only doing inserts. Also note that DataStage doesn't show that it "successfully loaded all records"; all you really know is that a certain number of actions were sent to the database and it reported no errors.
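This distinction, between "the driver accepted the statement" and "the row actually persisted", can be illustrated with a small stand-in demo. The sketch below uses Python's built-in sqlite3 module (not DB2/400, and not the actual ODBC driver in question) purely to show the pattern: every insert is accepted without error, yet if the session ends before a commit, the rows never appear, and no warning is raised anywhere.

```python
import os
import sqlite3
import tempfile

# Stand-in demo only (sqlite3, not DB2/400): each INSERT "succeeds",
# but uncommitted work is silently discarded when the connection closes.
db = os.path.join(tempfile.mkdtemp(), "demo.db")

con = sqlite3.connect(db)
con.execute("CREATE TABLE t (id INTEGER)")
con.commit()                      # the table itself is committed

cur = con.cursor()
for i in range(41):
    cur.execute("INSERT INTO t VALUES (?)", (i,))
print(cur.rowcount)               # 1 -- the driver reported each insert as fine
con.close()                       # no commit: the 41 pending inserts are lost

con = sqlite3.connect(db)
count = con.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(count)                      # 0 -- rows gone, and no error was ever raised
con.close()
```

Whether DB2/400's commitment control and the "With None" clause can produce an equivalent silent loss on a journaled (or non-journaled) table is exactly the open question in this thread, which is why tracing the load from the AS/400 side is worth the effort.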
-craig
"You can never have too many knives" -- Logan Nine Fingers
kandyshandy wrote: When you say 'missing 41 records', do you mean that they are not inserted, but DS shows that they are inserted?

I can find half of those records in the AS400 table with old timestamps, and half were not loaded at all. Still, the DataStage log says 41 records loaded successfully.
chulett wrote: Also note that DataStage doesn't show that it "successfully loaded all records"; all you really know is that a certain number of actions were sent to the database and it reported no errors.

That is an interesting point that I have been dwelling on too. These jobs are at least two years old. From what I know, all updates to records by any process in the entire project are supposed to be loaded into these tables. If my assumption is correct, then every update, whether daily, monthly, or at any other frequency, should be present in the table. Have you worked with an AS400 and loaded data into it?