
RTI - SOAP over HTTP using passive stages

Posted: Wed Nov 10, 2004 9:54 pm
by Kirtikumar
Hello,
I am working on RTI with SOAP over HTTP.
The issue I am facing is this: I am creating a hash file from the XML input in an RTI job, and next I want to use the same hash file for some lookups, but it gives the compile-time error "input and output links for passive stage not permitted in RTI".

The previous design was something like this (DB2 is the driving link):

Code:


                                                    db2
                                                     |
                                                     |
RTIin --> XMLIn --> Transformer --> hashfile ....> Transformer --> XMLout

 
So I split the design as below, which removed the error.
In the split design the hash file name is obviously the same in both halves. But I am not sure how the split flows on a single job canvas are executed, i.e. their order of execution.

Code:


RTIin --> XMLIn --> Transformer --> hashfile

                                           db2
                                            |
                                            |
                    hashfile ........> Transformer --> XMLout
 
I am using the parsed XML data as the lookup, and not DB2, because the XML input has only one row and I have to get all matching rows for that single input row from the DB2 database table.


Thank You......
Kirti

Posted: Wed Nov 10, 2004 11:45 pm
by vmcburney
Can you take the hash file out of the design? It makes the real time design a bit messy. It doesn't seem to like having it as a target and source in the one RTI job.

Posted: Thu Nov 11, 2004 5:12 am
by Kirtikumar
Hello,
We will be removing hash files from RTI-enabled jobs. The reason is that for an RTI-enabled job multiple instances may be active at the same time, so multiple instances of the same job may try to update the same file and cause a collision. Any passive stage such as a hash or sequential file may thus become a bottleneck affecting job performance, so sequential/hash files should never be used here.
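The collision described above is the classic concurrent read-modify-write hazard. A minimal Python sketch (threads standing in for RTI job instances and a dict standing in for the shared hash file; this is illustrative only, not DataStage code) shows why the updates would need to be serialized:

```python
# Illustrative sketch of the collision problem: several "job instances"
# (threads here) updating one shared resource. The read-modify-write
# below is only safe because a lock serializes it -- DataStage offers
# no such lock across independent RTI job instances.
import threading

counts = {"rows_written": 0}   # stands in for the shared hash file
lock = threading.Lock()

def instance(n_updates: int) -> None:
    for _ in range(n_updates):
        with lock:                         # serialize the read-modify-write
            counts["rows_written"] += 1    # without the lock, updates can be lost

threads = [threading.Thread(target=instance, args=(1000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counts["rows_written"])  # 4000: no updates lost while serialized
```

With the lock the final count is deterministic; without any such coordination, concurrent writers can interleave and lose updates, which is exactly the collision being worried about here.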

So now, to get multiple rows from the lookup, we are trying to use the ODBC stage, which allows a multi-row lookup.
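For illustration only, here is what a multi-row lookup means in plain Python/SQLite terms (this is not DataStage or ODBC-stage code, and the table and column names are invented for the example): one key from the single "XML input" row returns every matching database row, not just the first.

```python
# Multi-row lookup sketch: a singleton lookup would return one row per
# key; a multi-row lookup returns ALL rows matching the key, which is
# what the ODBC stage's multi-row option provides.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (cust_id TEXT, order_no INTEGER)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("C1", 101), ("C1", 102), ("C1", 103), ("C2", 201)],
)

xml_row = {"cust_id": "C1"}  # the single row parsed from XMLIn

rows = conn.execute(
    "SELECT order_no FROM orders WHERE cust_id = ? ORDER BY order_no",
    (xml_row["cust_id"],),
).fetchall()

print([r[0] for r in rows])  # [101, 102, 103]
```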

Thank you,
Kirti
vmcburney wrote:Can you take the hash file out of the design? It makes the real time design a bit messy. It doesn't seem to like having it as a target and source in the one RTI job.

Posted: Thu Nov 11, 2004 2:38 pm
by ray.wurlod
If you would prefer to use hashed files (so that they can be used in other, non-RTI-enabled jobs), you can use the UV stage to access them. The UV stage also offers multi-row return.

However, you don't get the memory cache (but, then, you don't get that with ODBC stage either). Another advantage of UV over ODBC is that you don't have the overhead of driver software; connection from a UV stage to hashed files is via a direct mechanism (the BASIC SQL Client Interface, or BCI).

Why not Hash files in RTI jobs?

Posted: Sat Nov 13, 2004 12:00 am
by Kirtikumar
Hi,
I can use the UV stage to access the hash files. But the problem with hash files in my jobs is twofold:
1. First I will have to create/update the hash files from the DB2 stage each time an instance is created from the front end. (Creating/updating the hash file each time an instance is created is mandatory, because I want up-to-date data from the database to be sent to the callers; sometimes, depending on need, I am also updating the database data.)
2. Then use the UV stage to access those hash files.

And since hash files are created with a fixed name, and under the RTI model multiple instances of a job may exist at the same time, multiple instances may try to update the same hash file and cause collisions, affecting job performance.
I am not very familiar with the ODBC or UV stages, but when I put them on the palette/Designer canvas, both stages need a Data Source Name (DSN). So a DSN needs to be created for both stages. Then what is the difference between the two stages? And as of now the hash files are not needed elsewhere in batch jobs.

Regards,
Kirti
ray.wurlod wrote:If you would prefer to use hashed files (so that they can be used in other, non-RTI-enabled jobs), you can use the UV stage to access them. The UV stage also offers multi-row return.

However, you don't get the memory cache (but, then, you don't get that with ODBC stage either). Another advantage of UV over ODBC is that you don't have the overhead of driver software; connection from a UV stage to hashed files is via a direct mechanism (the BASIC SQL Client Interface, or BCI).

Posted: Sat Nov 13, 2004 10:12 pm
by ray.wurlod
To address your specific points:

1. This will be true no matter which mechanism you choose. If you want to send current information to the caller, you must obtain current information from the database. However, you can use a shared hashed file (publicly cached) and keep it up to date at the same time as you keep your database up to date.

2. UV stages access local hashed files via the pre-defined DSN localuv. If the hashed file is in a different account, you need to edit uvodbc.config to identify a different DSN whose database type is UNIVERSE.
The difference between UV and ODBC is that UV accesses only UniVerse tables, and does so without the use of an intermediate driver.
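For reference, entries in uvodbc.config take roughly the form sketched below. The <localuv> entry is the pre-defined local DSN mentioned above; <otheraccount> is a hypothetical second DSN whose database type is UNIVERSE. The field values here are placeholders, so check your site's existing uvodbc.config and the UniVerse documentation for the exact format:

```
<localuv>
DBMSTYPE = UNIVERSE
network = TCP/IP
service = uvserver
host = 127.0.0.1

<otheraccount>
DBMSTYPE = UNIVERSE
network = TCP/IP
service = uvserver
host = your.server.name
```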

Job Design Problem

Posted: Mon Nov 15, 2004 9:40 pm
by Kirtikumar
Hi Ray,
I agree that hash files can be used and provide performance benefits. But the problem I am worrying about is that, since RTI will create multiple job instances, each instance will try to update the same hash file and may cause a collision.

Another problem with hash files is how I will create them at one point in a single job and use them at another point in the same job. In my jobs I want to compare one row from XMLIn with the database and extract multiple matching rows from there. For this I previously created a hash file from that single row and used it as the reference link (as mentioned in my first post), making DB2 the driving link. But using passive stages with both input and output links is not permitted in RTI jobs. Now if I create the hash files from DB2 and read them through the UV stage, the problem is of course the job design.
The design which I can think of is as follows:

Code:



XMLIn --------------> Transformer
                          |
                          | (driving link)
            (ref)         |
UV stage ------------> Transformer ------> XMLOut

The UV stage will access the hash file created/updated from the database, but where do I create/update this hash file? We can't create/update it in a separate job, and I cannot think of any way of creating it in this job from the database.

Thanks & Regards,
Kirti