ODBC - delete then insert
Moderators: chulett, rschirm, roy
-
- Participant
- Posts: 13
- Joined: Thu May 28, 2015 3:59 am
- Location: Pune
ODBC - delete then insert
Hi Friends,
Requirement: We are loading data into an Oracle database keyed on CERTIFICATE_TEMPLATE_ID, so before loading we need to delete the existing records for the same CERTIFICATE_TEMPLATE_ID.
Implementation: In the ODBC stage I chose Write mode = "Delete then insert".
Insert statement:
INSERT INTO "O2K"."CERT_TEMPLATE_BINDING" ("CERTIFICATE_TEMPLATE_ID","TEMPLATE_VARIABLE","FIELD_NAME") VALUES (ORCHESTRATE.CERTIFICATE_TEMPLATE_ID, ORCHESTRATE.TEMPLATE_VARIABLE, ORCHESTRATE.FIELD_NAME)
Delete statement:
DELETE FROM O2K.CERT_TEMPLATE_BINDING WHERE CERTIFICATE_TEMPLATE_ID = ORCHESTRATE.CERTIFICATE_TEMPLATE_ID
Status: The job finishes fine, but the data is wrong.
For example, for CERTIFICATE_TEMPLATE_ID=449 I have 40 records in the CERT_TEMPLATE_BINDING table. When I run the job, it first deletes all 40 records and then inserts the first node's records (if there are 50 records in total, the first node inserts its 25). Then the second node executes the delete statement again, deleting the first node's 25 records, and inserts its own 25 records.
After the job runs, the target table contains only the 25 records from the second node.
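The behavior described above can be sketched in plain Python (a hedged simulation, not DataStage code; the in-memory dict stands in for the Oracle table). Each parallel node runs its own delete before its own inserts, so the later node's delete wipes out the rows the earlier node already inserted:

```python
# Simulates why per-node "delete then insert" loses rows.
table = {}  # CERTIFICATE_TEMPLATE_ID -> list of rows

def node_write(node_rows, key):
    # each node first deletes all rows for the key ...
    table[key] = []
    # ... then inserts only its own partition of the data
    table[key].extend(node_rows)

all_rows = [("449", f"VAR_{i}") for i in range(50)]
node1, node2 = all_rows[:25], all_rows[25:]

node_write(node1, "449")  # node 1: delete old rows, insert its 25
node_write(node2, "449")  # node 2: delete node 1's 25 rows, insert its 25

print(len(table["449"]))  # only 25 rows survive, not 50
```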
What I tried (did not work):
1) Running the ODBC stage in sequential mode / on a single node. This idea failed.
Friends, please suggest how I can delete the old records and insert the new records based on CERTIFICATE_TEMPLATE_ID.
Thanks and Regards,
Vinay Sharma
Sr. BI Consultant
Cell:(IND) 952-766-1393
-
- Premium Member
- Posts: 1735
- Joined: Thu Mar 01, 2007 5:44 am
- Location: Troy, MI
Hash partition on CERTIFICATE_TEMPLATE_ID. This ensures all 50 records for CERTIFICATE_TEMPLATE_ID=449 land in the same partition.
However, that alone is only guaranteed to work when all 50 records go in the same transaction, so you may also have to set the transaction size (i.e. the commit interval) to 0.
Last edited by priyadarshikunal on Mon Jul 06, 2015 6:07 am, edited 1 time in total.
Priyadarshi Kunal
Genius may have its limitations, but stupidity is not thus handicapped.
-
- Premium Member
- Posts: 1735
- Joined: Thu Mar 01, 2007 5:44 am
- Location: Troy, MI
However, I prefer to separate the deletes from the inserts and run them as two distinct steps.
One question: if you need to connect to Oracle, why not use an Oracle Connector instead?
Also, what happened when you ran the job on one node?
Priyadarshi Kunal
Genius may have its limitations, but stupidity is not thus handicapped.
Basically, the requirement is to execute multiple SQL statements (delete and insert) with a single commit. Like the previous poster, I would advise using an Oracle Connector rather than an ODBC stage.
This requirement can be supported with multiple designs, some of which are:
1) Local transactions with database connectors.
2) Server job, utilizing option of single commit.
3) External PL/SQL Package.
4) Distributed Transaction Stage.
Depending on business need, I have seen #1 through #3 implemented; I have never tried #4.
My preferred approach for a "reasonable" amount of incoming data is #1:
1. Add a transformer before the database stage to copy the incoming data stream to two output streams (one for insert, the other for delete).
2. Remove duplicates from the "delete" stream based on the key CERTIFICATE_TEMPLATE_ID.
3. Direct both streams to the same database connector.
4. Perform the deletes first and the inserts afterwards, followed by a single commit.
Do see the section titled "Local transactions with database connectors" in Guaranteed delivery with InfoSphere DataStage.
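Steps 1-4 above can be sketched with sqlite3 standing in for Oracle (a hedged illustration of the single-transaction idea, not the DataStage connector itself): one delete per distinct key, then all inserts, then a single commit, with rollback on failure.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE CERT_TEMPLATE_BINDING (
    CERTIFICATE_TEMPLATE_ID INTEGER,
    TEMPLATE_VARIABLE TEXT,
    FIELD_NAME TEXT)""")
# 40 pre-existing rows for template 449
conn.executemany(
    "INSERT INTO CERT_TEMPLATE_BINDING VALUES (?, ?, ?)",
    [(449, f"OLD_VAR_{i}", f"OLD_FIELD_{i}") for i in range(40)])
conn.commit()

new_rows = [(449, f"VAR_{i}", f"FIELD_{i}") for i in range(50)]
try:
    # step 2: one delete per distinct key (duplicates removed)
    for key in {r[0] for r in new_rows}:
        conn.execute(
            "DELETE FROM CERT_TEMPLATE_BINDING "
            "WHERE CERTIFICATE_TEMPLATE_ID = ?", (key,))
    # step 4: all inserts, then a single commit
    conn.executemany(
        "INSERT INTO CERT_TEMPLATE_BINDING VALUES (?, ?, ?)", new_rows)
    conn.commit()
except Exception:
    # any failure leaves the old 40 rows intact
    conn.rollback()
    raise

count = conn.execute(
    "SELECT COUNT(*) FROM CERT_TEMPLATE_BINDING "
    "WHERE CERTIFICATE_TEMPLATE_ID = 449").fetchone()[0]
print(count)  # 50
```

Because the delete and all inserts share one transaction, a reader of the table never sees the half-loaded state that the parallel delete-then-insert job produced.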
-
- Participant
- Posts: 13
- Joined: Thu May 28, 2015 3:59 am
- Location: Pune
Re: ODBC - delete then insert
Hi,
Now I am using only Write mode = Insert.
The ODBC and Oracle stages have the "Run before and after SQL statements" functionality, and I am using the Before SQL statement.
Insert command ->
INSERT INTO "O2K"."CERT_TEMPLATE_BINDING" ("CERTIFICATE_TEMPLATE_ID","TEMPLATE_VARIABLE","FIELD_NAME") VALUES (ORCHESTRATE.CERTIFICATE_TEMPLATE_ID, ORCHESTRATE.TEMPLATE_VARIABLE, ORCHESTRATE.FIELD_NAME)
Before SQL statement command ->
DELETE FROM O2K.CERT_TEMPLATE_BINDING WHERE CERTIFICATE_TEMPLATE_ID=#FILENAME#
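Why this works can be sketched in plain Python (a hedged stand-in for the DataStage behavior, not real stage code; the key value stands in for the #FILENAME# job parameter): the Before SQL delete runs exactly once before any node starts writing, so the parallel phase is insert-only and nodes cannot wipe out each other's rows.

```python
# In-memory stand-in for the Oracle table: 40 pre-existing rows
table = [(449, f"OLD_{i}") for i in range(40)]
key = 449  # value supplied via the #FILENAME# job parameter

# Before SQL statement: runs once, before the parallel write phase
table = [row for row in table if row[0] != key]

# Parallel insert-only phase: each node only appends its own rows
all_rows = [(449, f"VAR_{i}") for i in range(50)]
for node_rows in (all_rows[:25], all_rows[25:]):
    table.extend(node_rows)

print(len(table))  # 50: all old rows gone, all new rows present
```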
Thanks and Regards,
Vinay Sharma
Sr. BI Consultant
Cell:(IND) 952-766-1393
Re: ODBC - delete then insert
It seems that all records in the incoming data file are for a single CERTIFICATE_TEMPLATE_ID. For future reference, it is helpful to mention such key facts early on.
vinu1103sharma wrote:
Before SQL statement command ->
delete from O2K.CERT_TEMPLATE_BINDING where CERTIFICATE_TEMPLATE_ID=#FILENAME#