Result rows are not showing up - total mismatch

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

qutesanju
Participant
Posts: 373
Joined: Tue Aug 26, 2008 4:52 am

Result rows are not showing up - total mismatch

Post by qutesanju »

Hi all,

I have one job designed like this:

stored procedure --> transformer --> insert data into third-party plugin (like Meridium)

The job completes with 100 rows, but when I check the table in Meridium (through the third-party plugin) I find that only 90 rows have arrived.

I checked the key fields also, but the same problem still occurs.

Can anybody suggest anything for this scenario?
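
One practical way to narrow this down, independent of the plugin: if the Meridium table can be read back over ODBC, diff the key values the stored procedure returns against the keys that actually landed in the target. Below is a minimal Python sketch; the DSN names, procedure name, table and column names are all hypothetical placeholders, and it assumes the key is the first column returned.

Code:

import pyodbc

src = pyodbc.connect("DSN=SourceDB")      # database hosting the stored procedure (placeholder DSN)
tgt = pyodbc.connect("DSN=MeridiumDB")    # database behind the Meridium plugin (placeholder DSN)

# Key values the procedure hands to the job (key assumed to be the first column)
src_keys = {row[0] for row in src.cursor().execute("{CALL dbo.MySourceProc}")}

# Key values that actually landed in the target table
tgt_keys = {row[0] for row in tgt.cursor().execute(
    "SELECT key_col FROM meridium_target_table")}

missing = src_keys - tgt_keys
print(f"source rows: {len(src_keys)}, target rows: {len(tgt_keys)}")
print(f"missing keys ({len(missing)}):")
for key in sorted(missing):
    print(key)

Whatever keys fall out of that diff are the rows to inspect for a pattern.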
chulett
Charter Member
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

How can anyone suggest anything without knowledge of your "third party plugin"? You'd need to involve your support provider for that.
-craig

"You can never have too many knives" -- Logan Nine Fingers
Sainath.Srinivasan
Participant
Posts: 3337
Joined: Mon Jan 17, 2005 4:49 am
Location: United Kingdom

Post by Sainath.Srinivasan »

Are there any logs? Did you try to find the missing rows?
qutesanju
Participant
Posts: 373
Joined: Tue Aug 26, 2008 4:52 am

In the log file everything looks good

Post by qutesanju »

Hi Sainath,
I checked the log file and it looks like a normal log; there are no warnings either.

I am trying to send the job output to a sequential file for cross-checking,
so the flow will be ODBC stage --> transformer --> sequential file.

Sainath.Srinivasan wrote:Are there any logs? Did you try to find the missing rows?
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

The figure that DataStage reports is the number of rows that it sent to the application.

If your third-party application has discarded some of those I would expect it (the application) to have its own logs.

If this is a custom-written plug-in stage type, its author may or may not have provided for detection and reporting of rejects.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
qutesanju
Participant
Posts: 373
Joined: Tue Aug 26, 2008 4:52 am

Post by qutesanju »

I tried an alternate way:

I inserted all the rows from the procedure into a SEQUENTIAL file,
and then inserted them into the third-party plugin, but it is still not loading all the rows.

Is there any alternate way to do this?
Is there any difference if I use a HASHED file instead of a sequential file?
chulett
Charter Member
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Yes, you'd just make it more... difficult... then you'd have to worry about unique keys and destructive overwrite and all that. Stick with the flat file.
-craig

"You can never have too many knives" -- Logan Nine Fingers
qutesanju
Participant
Posts: 373
Joined: Tue Aug 26, 2008 4:52 am

Post by qutesanju »

I tried it with a sequential file:

First I took the records from the stored procedure and inserted them into a sequential file,
then dumped the records from that sequential file into the third-party plugin/database.

I linked the two child jobs with a sequencer so that the 2nd job is called after the 1st job.

But the sequential file gets 1000 rows, while the row count in the final Meridium plugin table (third-party plugin) is 900.

So I am puzzled - where have the remaining 100 records gone?
I cross-checked the key fields for the table also.

Can you please suggest a remedy for this?

chulett wrote:Yes, you'd just make it more... difficult... then you'd have to worry about unique keys and destructive overwrite and all that. Stick with the flat file. ...
Sainath.Srinivasan
Participant
Posts: 3337
Joined: Mon Jan 17, 2005 4:49 am
Location: United Kingdom

Post by Sainath.Srinivasan »

Did you identify which records are missing and see whether there is any pattern that may cause them to be dropped?

Also, what happens if you load only the 900 rows that were successfully loaded?

Maybe your 3rd party s/w is working at 90% efficiency.
qutesanju
Participant
Posts: 373
Joined: Tue Aug 26, 2008 4:52 am

Post by qutesanju »

I have to check why the records are missing.
I also tried increasing the number of keys in the input to ensure more records get inserted.

I will check for a pattern.
But there is no problem with the third-party plugin (it is more or less the same as inserting rows into the database directly with a plugin);
when I inserted rows with this plugin earlier, there was no issue.
-----------------------------------
Sainath.Srinivasan wrote:Did you identify which records are missing and see whether there is any pattern that may cause them to be dropped?

Also, what happens if you load only the 900 rows that were successfully loaded?

Maybe your 3rd party s/w is working at 90% efficiency.
girija
Participant
Posts: 89
Joined: Fri Mar 24, 2006 1:51 pm
Location: Hartford

Post by girija »

I think this is an issue with your "third party plugin" (if you consider it an issue). Did you check your output/target rows? It may use upsert (if your target is a database): instead of inserting, it updates the row according to the key. Since you get 1000 rows in your seq file, there is no problem in your job.
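
The upsert theory is easy to test from the sequential file the job already writes: count the distinct key values. If the 1000 rows contain only 900 distinct keys, a keyed upsert (or any one-record-per-key behaviour) in the target would explain the shortfall exactly. A small Python sketch, assuming a hypothetical file path, a pipe delimiter and the key in the first column:

Code:

import csv
from collections import Counter

key_counts = Counter()
with open("/tmp/job_output.txt", newline="") as f:        # path is a placeholder
    for row in csv.reader(f, delimiter="|"):              # delimiter is a placeholder
        key_counts[row[0]] += 1                           # key assumed to be the first column

total_rows = sum(key_counts.values())
duplicates = {k: n for k, n in key_counts.items() if n > 1}
print(f"{total_rows} rows, {len(key_counts)} distinct keys")
print("duplicated keys:", duplicates)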
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Are all the rows making it into the sequential file?

If not, is there a constraint expression in the Transformer stage?
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
qutesanju
Participant
Posts: 373
Joined: Tue Aug 26, 2008 4:52 am

Post by qutesanju »

Or is there any alternative for the overall job design?

My requirement is to take data from a stored procedure and insert it into a third-party plugin table (like a Meridium table, which is the same as a database table).
qutesanju
Participant
Posts: 373
Joined: Tue Aug 26, 2008 4:52 am

Post by qutesanju »

Yeah!
All rows from the stored procedure are getting inserted into the sequential file,
as the row counts of the stored procedure and the seq file match.

But the row counts of the seq file and the third-party plugin do not match (fewer rows are being inserted into the third-party plugin).

-----------------------------------------------------------------------------
ray.wurlod wrote:Are all the rows making it into the sequential file?

If not, is there a constraint expression in the Transformer stage? ...
Sainath.Srinivasan
Participant
Posts: 3337
Joined: Mon Jan 17, 2005 4:49 am
Location: United Kingdom

Post by Sainath.Srinivasan »

To solve this, you must find the missing rows and the reason why they are being ignored.

You can
a.) clear the 3rd-party s/w and insert the 900 successfully loaded rows again, or
b.) load 1 row at a time and see which row fails.
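
A minimal sketch of option b.), assuming (and this is an assumption; the plugin may do more than a plain insert) that the Meridium table can also be written directly over ODBC. The DSN, file path, delimiter, table and column names are placeholders; the per-row errors show exactly which rows the target refuses.

Code:

import csv
import pyodbc

conn = pyodbc.connect("DSN=MeridiumDB", autocommit=True)     # placeholder DSN
cur = conn.cursor()

rejected = []
with open("/tmp/job_output.txt", newline="") as f:            # placeholder path and delimiter
    for lineno, row in enumerate(csv.reader(f, delimiter="|"), start=1):
        try:
            cur.execute(
                "INSERT INTO meridium_target_table (key_col, value_col) VALUES (?, ?)",
                row[0], row[1])
        except pyodbc.Error as exc:
            rejected.append((lineno, row[0], str(exc)))

print(f"{len(rejected)} rows rejected")
for lineno, key, err in rejected:
    print(lineno, key, err)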