RTI passive stage Error - SOAP over HTTP

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

Kirtikumar
Participant
Posts: 437
Joined: Fri Oct 15, 2004 6:13 am
Location: Pune, India

RTI passive stage Error - SOAP over HTTP

Post by Kirtikumar »

Hello,
I am working on RTI with SOAP over HTTP.
The issue I am facing is this: I am creating a hash file from the XML input in the RTI job, and next I want to use the same hash file for some lookups, but it gives a compile-time error: "input and output links for passive stage not permitted in RTI".

The previous design was something like this (DB2 is the driving link):

Code: Select all

                                                    db2
                                                      |
                                                      |
RTIin--->XMLIn--->Transformer---->hashfile------->Transformer---->XMLout

So I split the design as below, which removed the error.
In the split design, the hash file name is obviously the same in both pieces. But I am not sure how the split pieces on a single canvas/palette are executed, i.e. their order of execution.

Code: Select all

RTIin--->XMLIn--->Transformer---->hashfile

                                                                 db2
                                                                  |
                                                                  |
                                              hashfile------->Transformer------>XMLout
I am using the parsed XML data as the lookup rather than DB2 because the XML input has only one row, and I have to get all matching rows for that single input row from the DB2 database table.
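For what it's worth, the lookup direction described here can be sketched outside DataStage as a plain key-value match. This is only an illustration of the pattern, not RTI code; the field names (`cust_id`, `order`) and row data are invented:

```python
# Hypothetical sketch: the single parsed XML row supplies the lookup
# key, and the DB2 result set is the driving stream from which all
# matching rows are collected. Field names are invented.

xml_row = {"cust_id": "C42"}      # the one row coming from XMLIn

db2_rows = [                      # rows coming from the db2 link
    {"cust_id": "C42", "order": 1},
    {"cust_id": "C99", "order": 2},
    {"cust_id": "C42", "order": 3},
]

# Keep every DB2 row whose key matches the single XML input row.
matches = [r for r in db2_rows if r["cust_id"] == xml_row["cust_id"]]
print(matches)  # both rows with cust_id C42
```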


Thank you,
Kirti
mleroux
Participant
Posts: 81
Joined: Wed Jul 14, 2004 3:18 am
Location: Johannesburg, South Africa

Post by mleroux »

I haven't worked with Real-Time Integration (RTI) services myself, but the error DataStage gives does make sense: RTI is designed to deal with each row immediately as it comes in, i.e. there's not supposed to be 'normal' batch ETL processing with RTI.

IMHO, allowing a passive stage such as a hashed file in RTI would either:

1) Create a bottleneck and destroy the RTI concept by batching everything together before writing it to the output link(s), or

2) Make the passive stage useless, since the row would go in and come straight out again.

Does this make sense?
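The two scenarios above can be sketched abstractly. This is an analogy in Python, not a claim about DataStage internals; `passthrough` and `batching` are invented names for the two behaviors:

```python
# Analogy only: contrast per-row (RTI-style) handling with a passive
# stage that accumulates rows before releasing them.

def passthrough(rows):
    """Scenario 2: the 'passive' step adds no value -- each row
    goes in and comes straight back out, unbuffered."""
    for row in rows:
        yield row

def batching(rows):
    """Scenario 1: the passive stage buffers everything first,
    creating a bottleneck before any output is written."""
    buffered = list(rows)      # nothing is emitted until all rows arrive
    for row in buffered:
        yield row

rows = [1, 2, 3]
# Same final output either way; only the latency profile differs.
assert list(passthrough(rows)) == list(batching(rows)) == rows
```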
Morney le Roux

There are only 10 kinds of people: Those who understand binary and those who don't.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

A couple of points:

There is a forum specific to RTI/SOA, so you'd be better off posting there.

I don't have any direct experience with RTI, but from what I've seen, simply having a hash file in a job wouldn't be a problem. Building one using the traditional server/batch model would, however. That hash file build should be split out into a separate job; otherwise I'm pretty sure it will be rebuilt for every row that comes through the job. I doubt that's what you had in mind.

To answer your question regarding the execution order of your changed job, the two 'pieces' would run in parallel.
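The cost argument about rebuilding the hash file per row can be illustrated with a small sketch. This is an analogy only (not DataStage code): `build_lookup` stands in for the hash file build, and the counter shows how often that work is repeated under each design:

```python
# Analogy: why building the lookup inside the per-row service path is
# expensive. Rebuilding a reference table for every incoming request
# repeats the same work each time; building it once up front (a
# separate 'build' job) pays the cost exactly once.

build_count = 0

def build_lookup(reference_rows):
    global build_count
    build_count += 1                       # track how often we rebuild
    return {r["key"]: r for r in reference_rows}

reference = [{"key": "A", "val": 1}, {"key": "B", "val": 2}]
requests = ["A", "B", "A"]

# Naive: rebuild the lookup for every request (build inside the job).
for req in requests:
    lookup = build_lookup(reference)
    _ = lookup.get(req)
assert build_count == len(requests)        # rebuilt once per row

# Better: build once in a separate step, then serve every request.
build_count = 0
lookup = build_lookup(reference)
for req in requests:
    _ = lookup.get(req)
assert build_count == 1                    # built exactly once
```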
-craig

"You can never have too many knives" -- Logan Nine Fingers