
Posted: Fri Dec 15, 2006 7:18 am
by DSguru2B
DSguru2B wrote: If you get any key violation, the warning will be created. That's why you need to explicitly handle it with the method devised, and capture the duplicates in a separate file. This way you don't send any duplicates.
Says who? I did, right there :P

thurmy34:
Go to the Sequential File stage properties. On the General tab, check the box that says 'Stage uses filter commands', then switch to the Outputs tab, where you will find the Filter Command box active. You can specify OS-level commands there.
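For example (purely an illustration, not from the original posts): if the file were pipe-delimited with the key in the first column, a filter command that keeps only one row per key could be as simple as

Code: Select all

   sort -t '|' -k 1,1 -u
The stage pipes the data through whatever command you put in that box, so any standard OS utility (sort, grep, awk, ...) can be used.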

Posted: Fri Dec 15, 2006 7:31 am
by shrek7577
Yes, I use these constraints:

Duplicates file:
(table.REJECTED and table.DBMSCODE = 'ORA-00001')

Filter file:
(table.REJECTED and table.DBMSCODE <> 'ORA-00001')

So I can catch every error from the table.
But DataStage also generates warnings in the log (in Director).
Here, a routine uses the log to build a summary, and the problem is that every warning stops the global batch...

Unfortunately, I can't modify the framework, but I do have permission to skip the table warnings (the batch must go on and all the warnings will be dealt with later).

:idea:

Posted: Fri Dec 15, 2006 8:31 am
by chulett
You don't want to catch every error, you want to avoid generating any errors. There's no reason on the planet for a Server job to ever generate any warnings. You should be checking for uniqueness before sending anything to your database and either turning that operation into an update or logically discarding the record via your constraints - not putting the burden on your DB and then catching anything it pukes back up. :P

This is perfectly 'normal' ETL processing and is easily accomplished with hashed file lookups that compare the natural keys in your input data against your target.
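As a rough sketch of that approach (the link name below is invented for illustration, not taken from the posts): load the target table's natural keys into a hashed file, use it as a reference lookup in the Transformer, and route each row on the lookup result, e.g.

Code: Select all

   Insert link constraint (key not yet in target):    LkpTargetKeys.NOTFOUND
   Update or discard link constraint (key exists):    NOT(LkpTargetKeys.NOTFOUND)
where LkpTargetKeys is the reference link coming from the hashed file. That way no key violation ever reaches the database, so no warning is logged.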

Posted: Fri Dec 15, 2006 9:31 am
by thurmy34
It was right in front of me.

Sorry for this useless post.

Thank you, DSguru2B.

Posted: Fri Dec 15, 2006 3:10 pm
by ray.wurlod
The following design will help. HashedFileStage1 and HashedFileStage2 refer to the same hashed file.

Code: Select all

           HashedFileStage1   HashedFileStage2
                       |              ^
                       |              |(not found)
                       V              |
              ----->   TransformerStage  ------------> SequentialFile
                                            (found)
The hashed file will contain the first occurrence of each key value; the sequential file will contain all the duplicates. In HashedFileStage1 set the read cache to "disabled, lock for updates" and in HashedFileStage2 do not enable the write cache.
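To make the routing explicit (the link name here is illustrative, not part of the original post): if the reference link from HashedFileStage1 is called LkpSeen, the constraints on the two output links would be along these lines:

Code: Select all

   Link to HashedFileStage2 (first occurrence):   LkpSeen.NOTFOUND
   Link to SequentialFile   (duplicate):          NOT(LkpSeen.NOTFOUND)
Because neither the read nor the write is cached, a key written for one row is visible to the lookup on the very next row, so every occurrence after the first is caught as a duplicate.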

Posted: Tue Dec 19, 2006 9:10 pm
by deepak.shanthamurthy
Hi,
I am trying to capture duplicate records.
I have sorted the records on the key
and am using the following code as stage variables:

Var = Key
flag = If Var = Var2 Then "Duplicate" Else "NO"
Var2 = Var

But this is not giving me the expected results.

Though I can see in my debugger that Var2 is retaining the old value of Var, the flag is always being set to "Duplicate"!

I'm not sure where the problem is.

Any help is appreciated.

Thanks

Posted: Tue Dec 19, 2006 9:33 pm
by I_Server_Whale
You cannot hijack a thread like this. You need to start a new thread/post.

Whale.

Posted: Wed Dec 20, 2006 12:12 am
by deepak.shanthamurthy
Sorry about that...
Starting this as a new thread...