Rules in transformer for deleting rows
Moderators: chulett, rschirm, roy
-
- Premium Member
- Posts: 12
- Joined: Thu Dec 18, 2008 8:55 am
Hello,
I have 2 stages (each one is retrieving data from a different table in a different Oracle 10g Database).
Stage 1 : Database A
Stage 2 : Database B
My goal is to remove an invoice from Database A if that invoice number has already been deleted in Database B.
I have tried to implement some rules in my transformer, but I have not had any success so far.
Any hints are most welcome.
Thanks,
Nick
OK... does 'deleted in Database B' mean actually gone / deleted / no longer there or is it more of a logical deletion by setting a flag? Assuming the former, worst case you could build a reference hashed file of B invoices and then stream them in from A, sending a delete transaction back to A if the lookup fails.
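For what it's worth, the lookup-fails-then-delete logic can be sketched in Python, with a plain set standing in for the hashed file of B invoices. The function name and invoice numbers here are made up for illustration:

```python
def find_invoices_to_delete(a_invoices, b_invoices):
    """Return the A invoice numbers whose lookup against B fails."""
    b_keys = set(b_invoices)  # the "reference hashed file" of B invoices
    # Stream A rows; a failed lookup means the invoice is gone from B.
    return [inv for inv in a_invoices if inv not in b_keys]

a_rows = ["1001", "1002", "1003"]
b_rows = ["1001", "1003"]  # 1002 was deleted in Database B
print(find_invoices_to_delete(a_rows, b_rows))  # ['1002']
```

Each invoice that falls out of the lookup is what you would send back to A as a delete transaction.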
-craig
"You can never have too many knives" -- Logan Nine Fingers
Not really sure how I could elaborate much on it. Build a hashed file with all of the invoices from B. Stream in invoices from A and do a lookup to the B hashed file, when your lookup fails you need to delete the A invoice. I'd probably do the actual deletion in a separate step by landing the keys to delete first, rather than deleting from the same table you are sourcing from, which could have... issues.
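A rough sketch of that "land the keys first, delete in a separate pass" pattern, with sqlite3 standing in for Oracle 10g. The table and column names (INVOICES, INV_NO) are assumptions for illustration only:

```python
import sqlite3

# In-memory database stands in for Database A.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE INVOICES (INV_NO TEXT PRIMARY KEY)")
conn.executemany("INSERT INTO INVOICES VALUES (?)",
                 [("1001",), ("1002",), ("1003",)])

# Pass 1: land the keys whose lookup against B failed. In a real job
# this would be a sequential file written by the lookup pass.
keys_to_delete = [("1002",)]

# Pass 2: delete only the landed keys, so we are never deleting from
# the same table we are still streaming source rows out of.
conn.executemany("DELETE FROM INVOICES WHERE INV_NO = ?", keys_to_delete)

remaining = [row[0] for row in
             conn.execute("SELECT INV_NO FROM INVOICES ORDER BY INV_NO")]
print(remaining)  # ['1001', '1003']
```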
-craig
"You can never have too many knives" -- Logan Nine Fingers
Have you not worked with hashed files yet? They are the heart and soul of Server jobs, so something to become very, very familiar with if you haven't already.
Probably easiest to do this in two jobs. First one that just creates the hashed file and looks something like:
OCI -> Transformer -> Hashed File
That one sources from B. Then a second job sources from A and in the transformer does a reference lookup to the above hashed file.
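The two-job split could be sketched like this, with a shelve file standing in for the hashed file that job 1 (OCI -> Transformer -> Hashed File) builds. Function names and invoice numbers are illustrative:

```python
import os
import shelve
import tempfile

# Persistent key-value file shared between the two "jobs".
path = os.path.join(tempfile.mkdtemp(), "b_invoices")

def job1_build_hashed_file(b_invoices):
    """Job 1: source from B and persist its invoice keys."""
    with shelve.open(path) as hf:
        for inv in b_invoices:
            hf[inv] = True

def job2_find_deletes(a_invoices):
    """Job 2: source from A, reference-lookup against the file."""
    with shelve.open(path, flag="r") as hf:
        return [inv for inv in a_invoices if inv not in hf]

job1_build_hashed_file(["1001", "1003"])            # 1002 gone from B
print(job2_find_deletes(["1001", "1002", "1003"]))  # ['1002']
```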
-craig
"You can never have too many knives" -- Logan Nine Fingers