User defined Insert query

Posted: Wed Sep 06, 2006 1:37 am
by ShaneMuir
Hi All,

I am having an issue with a user defined sql query using an ORAOCI8 stage.

The query is as follows:

Code: Select all

INSERT INTO 
   TABLE_1
SELECT
        :1 AS TRANSACTION_ID,
        :2 AS TRANSACTION_RELATIONSHIP_CODE,
        :3 AS BEFORE_OR_AFTER_IMAGE,
       CUSTOMER_ID,
       CUSTOMER_TYPE,
       CUSTOMER_STATUS_CODE,
       CUSTOMER_UPDATED
FROM 
   (SELECT 
         DISTINCT TABLE_2.*
    FROM
         CUSTOMER JOIN OPERATOR ON CUSTOMER_ID = OPERATOR_ID
    WHERE
        CONTROLLED_OBJECT_ID = :4
The query above runs without error, and the select statement brings back 2 records, but only one of them is being inserted into TABLE_1 when both should be. The job log shows no errors.

Oracle stage settings: Array size: 1000, Rows per transaction: 0

I have run the query in the database and it works fine.

I have also recreated the job in the dev environment and it works fine there too. I cannot recreate the same error. :!:

Does anybody have any thoughts as to what is causing this problem? I am thinking about just rebuilding that particular stream to see if it goes away.

thanks in advance
Shane

Posted: Wed Sep 06, 2006 1:53 am
by ray.wurlod
You missed a right parenthesis, but that's just your keyboard!

What's different between development and production? Can you create a job that just does the internal SELECT, perhaps into a text file? What happens with Array Size set to 1? (1000 is ridiculously high for a query that is only going to return two rows.)
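[A standalone test job for the inner SELECT could use source SQL along these lines. This is a sketch reconstructed from the query posted above: the closing parenthesis and the inline-view alias (T, a made-up name) are restored on the assumption they were lost in the paste, and :4 remains a bind parameter as in the original.]

Code: Select all

SELECT
    CUSTOMER_ID,
    CUSTOMER_TYPE,
    CUSTOMER_STATUS_CODE,
    CUSTOMER_UPDATED
FROM
    (SELECT DISTINCT TABLE_2.*
     FROM CUSTOMER JOIN OPERATOR ON CUSTOMER_ID = OPERATOR_ID
     WHERE CONTROLLED_OBJECT_ID = :4) T

Writing these rows to a sequential file would show exactly how many rows the inner query returns before the insert is involved.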

Posted: Wed Sep 06, 2006 5:30 pm
by ShaneMuir
ray.wurlod wrote:You missed a right parenthesis, but that's just your keyboard!

What's different between development and production? Can you create a job that just does the internal SELECT, perhaps into a text file? What happens with Array Size set to 1? (1000 is ridiculously high for a query that is only going to return two rows.)
Damned cut and paste - ain't what it used to be :wink:

As far as I am aware there are no differences between dev and prod (other than general data quality). I agree on the transaction size, but I am just trying to recreate the error first (which I can't do).

I was able to use the internal select to create a text file and it has the correct number of records.

I am going to try loading a large number of records (as in production) and will see how that goes.

Thanks for your help, Ray.

Posted: Wed Sep 06, 2006 6:27 pm
by vmcburney
Can you turn on a database trace and get the exact SQL statement being sent to the database? I don't think array size or transaction size will make a difference since you are inserting two rows with a single statement. DataStage sees it as one statement. Could it be the database is accepting one row and rejecting the other for some type of key violation?
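[One way to capture the exact statements is a standard Oracle SQL trace on the DataStage session, sketched below; the SID and SERIAL# values are placeholders that would come from V$SESSION, and the trace must be run with DBA privileges.]

Code: Select all

-- Enable SQL trace for the session running the DataStage job
-- (123 and 456 are placeholder SID / SERIAL# values):
EXEC DBMS_SYSTEM.SET_SQL_TRACE_IN_SESSION(123, 456, TRUE);

-- The trace file lands in the instance's user dump directory and
-- can be formatted with: tkprof <tracefile>.trc out.txt sys=no

The formatted trace would show the INSERT as sent, including how many times it was executed and how many rows each execution affected.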

Posted: Wed Sep 06, 2006 7:11 pm
by kumar_s
Try splitting it into two stages: one with the select query, and another to insert the selected data.

Posted: Thu Sep 07, 2006 1:23 am
by ShaneMuir
vmcburney wrote:Could it be the database is accepting one row and rejecting the other for some type of key violation?
Very unlikely, if not impossible. The nested select statement returns distinct customer IDs, and that rounds out the key of the destination table. Also, investigation has shown that one of the returned rows is simply not being inserted; if it were a key violation, a conflicting row would at least already be in the table to begin with.
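[A quick way to rule a key clash in or out is to look for the missing row in the target before the run, sketched here; the WHERE values are hypothetical placeholders for the actual missing customer's key.]

Code: Select all

-- If this returns a row before the job runs, a key violation
-- is plausible; if it returns nothing, something else is
-- swallowing the insert. Key values below are placeholders.
SELECT *
FROM   TABLE_1
WHERE  CUSTOMER_ID = :missing_customer_id;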

The database trace sounds like a good idea though.

Will keep you all posted as I know that you are dying to know the outcome :wink:

Posted: Thu Sep 07, 2006 7:31 pm
by ShaneMuir
Ok the problem has been found and it had nothing to do with the query. :oops:

The query was running as it should have, but a few other dependent jobs were not. It turns out that another job, which populates one of the reference tables, ran late that day, so not all the operator IDs had been inserted at that point. :evil: So the select statement rightly returned only the one reference customer, and only the one record was inserted. :D

Obviously more controls are required over our scheduling.
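[One simple guard against this kind of scheduling race, sketched as a pre-load check based on the join condition in the original query; treating a non-zero count as "reference load not finished" is an assumption about the data model.]

Code: Select all

-- Customers with no matching operator: a non-zero count here
-- suggests the OPERATOR reference load has not yet completed,
-- and the main job should wait or abort.
SELECT COUNT(*)
FROM   CUSTOMER C
WHERE  NOT EXISTS
       (SELECT 1
        FROM   OPERATOR O
        WHERE  O.OPERATOR_ID = C.CUSTOMER_ID);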