
Result rows are not showing up - total mismatch

Posted: Mon Feb 09, 2009 4:56 am
by qutesanju
Hi all,

I have a job designed like this:

stored procedure --> transformer --> insert data into a third-party plugin (like Meridium)

The job completes with 100 rows, but when I check the table in Meridium (through the third-party plugin) I find that only 90 rows are coming through.

I checked the key fields as well,
but the same problem still occurs.

Can anybody suggest anything for this scenario?

Posted: Mon Feb 09, 2009 8:02 am
by chulett
How can anyone suggest anything without knowledge of your "third party plugin"? You'd need to involve your support provider for that.

Posted: Mon Feb 09, 2009 9:55 am
by Sainath.Srinivasan
Are there any logs? Did you try to find the missing rows?


Posted: Fri Mar 06, 2009 4:52 am
by qutesanju
Hi Sainath,
I checked the log file and it looks like a normal log file.
There are no warnings either.

I am trying to send the job output to a sequential file for cross-checking,
meaning the flow will be ODBC stage --> transformer --> sequential file.

Sainath.Srinivasan wrote:Are there any logs? Did you try to find the missing rows?

Posted: Fri Mar 06, 2009 1:09 pm
by ray.wurlod
The figure that DataStage reports is the number of rows that it sent to the application.

If your third-party application has discarded some of those I would expect it (the application) to have its own logs.

If this is a custom-written plug-in stage type, its author may or may not have provided for detection and reporting of rejects.

Posted: Fri Apr 03, 2009 8:44 am
by qutesanju
I tried an alternative way:

I inserted all the rows from the procedure into a SEQUENTIAL file
and then inserted them into the third-party plugin, but it is still not loading all the rows.

Is there any alternative way to do this?
Would it make any difference if I used a HASHED file instead of a sequential file?

Posted: Fri Apr 03, 2009 9:00 am
by chulett
Yes, you'd just make it more... difficult... then you'd have to worry about unique keys and destructive overwrite and all that. Stick with the flat file.

Posted: Mon Apr 06, 2009 8:02 am
by qutesanju
I tried it with a sequential file as follows:

First I took the records from the stored procedure and inserted them into a sequential file,
then dumped the records from that sequential file into the third-party plugin/database.

I linked the two child jobs with a sequencer so that the 2nd job is called after the 1st job.

But the sequential file gets 1000 rows, while in the final Meridium plugin table (third-party plugin) the row count is 900.

So I am amazed - where have the remaining 100 records gone?
I cross-checked the key fields for the table as well.

Can you please suggest a remedy for this?

chulett wrote:Yes, you'd just make it more... difficult... then you'd have to worry about unique keys and destructive overwrite and all that. Stick with the flat file. ...

Posted: Mon Apr 06, 2009 8:41 am
by Sainath.Srinivasan
Did you identify which records are missing and see whether there is any pattern that may cause them to be dropped? (A sketch for finding them is at the end of this post.)

Also, what happens if you load only the 900 rows that were successfully loaded?

Maybe your 3rd party s/w is working at 90% efficiency.
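
For instance, a minimal sketch in Python (outside DataStage) for listing which keys are missing - assuming you can export the key column of your sequential file and of the Meridium table into two text files; the file names here are made up:

Code:
# find_missing.py - list keys present in the job's sequential file
# but absent from the target table export.
# Assumption (hypothetical names): seq_file_keys.txt and meridium_keys.txt
# each hold one key value per line, exported beforehand.

def load_keys(path):
    """Read one key per line, stripping whitespace and skipping blanks."""
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

if __name__ == "__main__":
    source_keys = load_keys("seq_file_keys.txt")   # keys the job wrote out
    target_keys = load_keys("meridium_keys.txt")   # keys found in the target table
    missing = sorted(source_keys - target_keys)
    print(f"{len(missing)} keys missing from target:")
    for key in missing:
        print(key)

Once you have the missing keys, look for a pattern among them (length, special characters, nulls, duplicates).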

Posted: Mon Apr 06, 2009 8:47 am
by qutesanju
I have to check why the records are missing.
I also tried increasing the number of keys in the input so as to ensure more records get inserted.

I will check for a pattern.
But there is no problem with the third-party plugin (it is more or less the same as inserting rows directly into a database table via the plugin),
as I tried inserting rows with this plugin earlier and there was no issue.
-----------------------------------
Sainath.Srinivasan wrote: Did you identify which records are missing and see whether there is any pattern that may cause them to be dropped? ...

Posted: Mon Apr 06, 2009 11:09 am
by girija
I think this is an issue with your "third party plugin" (if you consider it an issue at all). Did you check your output/target rows? It may use an upsert (if your target is a database): instead of inserting, it updates the existing row according to the key. Since you get 1000 rows in your seq file, there is no problem in your job.
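
One quick way to test the upsert theory is to check whether the sequential file contains duplicate key values - if the 1000 rows collapse to 900 distinct keys, an upsert in the target would explain the mismatch exactly. A minimal sketch in Python, assuming a comma-delimited sequential file with the key in the first column (the file name and column position are made up):

Code:
# dup_keys.py - count how many rows in the sequential file share a key value.
# Assumption (hypothetical): output.txt is comma-delimited, key in column 0.
from collections import Counter

counts = Counter()
with open("output.txt") as f:
    for line in f:
        if line.strip():
            key = line.split(",")[0].strip()
            counts[key] += 1

dupes = {k: n for k, n in counts.items() if n > 1}
print(f"{sum(counts.values())} rows, {len(counts)} distinct keys")
for key, n in sorted(dupes.items()):
    print(f"key {key} appears {n} times")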

Posted: Mon Apr 06, 2009 1:08 pm
by ray.wurlod
Are all the rows making it into the sequential file?

If not, is there a constraint expression in the Transformer stage?

Posted: Tue Apr 07, 2009 1:08 am
by qutesanju
Or is there any alternative for the overall job design?

My requirement is to take data from a stored procedure and insert it into a third-party plugin table (a Meridium table, which is the same as a database table).

Posted: Tue Apr 07, 2009 1:19 am
by qutesanju
Yeah!!!
All rows from the stored procedure are getting inserted into the sequential file,

as the row counts of the stored procedure and the seq file match.

But the row counts of the seq file and the third-party plugin do not match (fewer rows are getting inserted into the third-party plugin).

-----------------------------------------------------------------------------
ray.wurlod wrote:Are all the rows making it into the sequential file?

If not, is there a constraint expression in the Transformer stage? ...

Posted: Tue Apr 07, 2009 6:08 am
by Sainath.Srinivasan
To solve this, you must find the missing rows and the reason why they are being ignored.

You can
a.) clear the 3rd party s/w and insert the 900 successfully loaded rows again, or
b.) load 1 row at a time and see which row fails (a sketch for splitting the file follows below).
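
If loading 1 row at a time by hand is impractical, a minimal sketch in Python (outside DataStage, with a made-up file name) to split the sequential file into one-row files that can each be fed to the load job:

Code:
# split_rows.py - split the sequential file into one-row files
# (single_rows/row_0001.txt, row_0002.txt, ...) so each row can be
# loaded separately and the failing rows identified.
# Assumption (hypothetical name): output.txt is the sequential file.
import os

os.makedirs("single_rows", exist_ok=True)
count = 0
with open("output.txt") as f:
    for line in f:
        if line.strip():
            count += 1
            with open(os.path.join("single_rows", f"row_{count:04d}.txt"), "w") as out:
                out.write(line)
print(f"wrote {count} single-row files to single_rows/")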