Always-on job using MQ stages

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

thebird
Participant
Posts: 254
Joined: Thu Jan 06, 2005 12:11 am
Location: India
Contact:

Always-on job using MQ stages

Post by thebird »

Hi,

I have been trying my hand at building an always-on job using the MQ stages. When a Transformer is placed between the XML Input stage and the final MQ stage, the job doesn't write to the queue in always-on mode. If the properties in the MQ stage are changed to make the job run for a specific time limit or message limit, then this works fine. I'm not sure if this has anything to do with end-of-wave, which has been discussed quite a lot on this site.

I need the Transformer to carry out a couple of look-ups, before writing a message to the output queue.

My current job design -

MQ---->XMLInput------>Trx (with 3 Database lookups)-------->MQ

Eventually I would have to write an XML message into the out Queue.

a) What could be the reason for this behaviour?
b) How can I get this to work?

I would have preferred to do this in a Parallel job, since there is a Parallel Shared Container that does all the look-ups. Going through this forum and Ernie's posts, I realised that a parallel job might not be feasible for an always-on scenario because of the end-of-wave issue, so I am having to re-build the look-up logic in Server.

c) Is there a way to manually create an end-of-wave in 7.x?
d) How can I do that? Any help?
e) Can I generate an end-of-wave from the MQ source queue, or from the application pushing data into the queue, which becomes the source for the DataStage bit?

Any help on this would be good to have.

Thanks!

Aneesh
Last edited by thebird on Mon May 10, 2010 11:01 pm, edited 1 time in total.
------------------
Aneesh

MQ---->XMLInput------>Trx -------->MQ

Post by thebird »

It is the same case, even if the database lookups are removed, and the design is -

MQ---->XMLInput------>Trx -------->MQ.

When the Transformer is removed, the messages are written to the output queue. But when it is in place, the job doesn't write anything.

Thanks!
eostic
Premium Member
Posts: 3838
Joined: Mon Oct 17, 2005 9:34 am

Post by eostic »

Hi Aneesh...

I use a Transformer all the time in my Server jobs... never have seen an end-of-wave issue caused "just" by the Transformer... but there are other things to consider:

a) how are you detecting that the messages are written (or not)?
b) are you ensuring that you are writing one message per row at the target?
c) have you compared other target types (commit = 1 with an rdbms target is a good test also, as an alternative)
d) what is in your transformer?

There is no way to automatically generate an end of wave in Server 7.x except for having this in an RTI job (where end of wave was first developed years ago).
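For readers unfamiliar with the concept, the idea behind end-of-wave can be sketched roughly as follows. This is illustrative Python only, not DataStage code, and the `EOW` marker is a hypothetical stand-in for the real mechanism:

```python
# Illustrative sketch only (not DataStage code): an end-of-wave marker
# lets a blocking stage flush what it has accumulated so far, without
# waiting for the whole input stream to finish.
EOW = object()  # hypothetical marker injected into the row stream

def flush_on_eow(stream):
    """Accumulate rows and emit a batch each time an EOW marker arrives."""
    waves, batch = [], []
    for item in stream:
        if item is EOW:
            waves.append(batch)  # flush the current wave downstream
            batch = []
        else:
            batch.append(item)
    return waves
```

Without the marker, a blocking stage holds everything until the job ends, which is exactly the problem in an always-on job that never ends.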

Ernie
Ernie Ostic

blogit!
<a href="https://dsrealtime.wordpress.com/2015/0 ... ere/">Open IGC is Here!</a>

Post by eostic »

...by the way... xmlOutput is (well... it can be, if you choose aggregate) a blocking stage, as are the Aggregator and (usually) Sort. Those are usually the biggest blockers in an end-of-wave scenario... so your original premise may be a concern here, even if you resolve the Transformer problem.

Let's also start talking about your volume of rows and what other alternatives you can consider.....

Ernie

Post by thebird »

Thanks for your time Ernie!

That's what I thought! Transformers have been in use for a long time, and an XML Input + Transformer combination seemed like quite a normal and regular requirement for an always-on job (any job, for that matter). But EOW was the only issue I could think of for why messages were not getting written out. Well, it's good to know that is not the case, and that there is a way forward.

Job design -

MQ---->XMLInput------>Trx -------->MQ

a) I am checking whether messages are being written out by looking in the out queue directly. I didn't think there should be a problem with this, unlike the known issue when checking output with Sequential File stages. Is that not a fair assumption? When I remove the Transformer and run the job, the messages do go to the out queue, with the job running indefinitely. I will also try with a database, with commit frequency set to 1 row, as has been suggested here many times before.

b) The "Rows per Message" property in the target Queue stage is set to 1, and "Rows per transaction" to zero. I believe this ensures that one message is written to the target each time.

c) Haven't tried the RDBMS yet. Will check this out and update here.

d) For now, the Transformer is a straight map through, i.e. it only propagates the fields coming out of the XML Input to the out queue. Eventually, though, the Transformer will do look-ups against 3-4 tables, fetch the reference values from the database, and pass them on to the out queue. The final output needs to have only the reference values/fields. No other logic is involved.

Regarding the caution about the XML Output stage - I was, and still am, under the impression that if the option is set to "Single Row" instead of "Aggregate" in the XML Output stage, there shouldn't be an issue with EOW. Is this an incorrect assumption?

In a real-time, always-on job, when would I want to set the XML Output property to "Aggregate"? I'm not getting this.

Appreciate your help.

Aneesh

Post by eostic »

Well... first of all, for future reference, xmlInput and the Transformer are not going to block end of wave... only the xml OUTPUT stage is the concern. Something else is going on...

Set everything to 1..... rows per message, rows per transaction...you want everything if possible to be "1".

Question --- is your incoming xml content also producing only one row on the output link from the xmlInput Stage?

If you use single row, then yes, I would hope that the xmlOutput will not block.....however, for future purposes (when you have 8.x and end of wave available), you will find that it is very common to need aggregate --- imagine if one of your lookups is designed to return multiple rows....
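The single-row vs. aggregate distinction can be sketched roughly like this (illustrative Python only, not DataStage code; the XML shapes here are made up for the example):

```python
# "Single row": emit one XML document per input row - nothing is held back,
# so rows flow through immediately even in an always-on job.
def single_row(rows):
    return ["<root><title>%s</title></root>" % r for r in rows]

# "Aggregate": collect many rows into one document - the stage must wait
# until the input ends (or an end-of-wave marker arrives) before it can
# emit anything, which is why aggregate mode is a concern in an
# always-on job. Useful when, e.g., a lookup returns multiple rows that
# all belong in one output message.
def aggregate(rows):
    items = "".join("<title>%s</title>" % r for r in rows)
    return ["<root>%s</root>" % items]
```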

Ernie

Post by thebird »

Ernie,

I set everything to 1 - rows per message and rows per transaction.
Still the same result, i.e. no output seen in the queue.

If I add another link from the XML Input straight to another queue, the output from the XML Input stage is written to that queue - but still nothing shows up in the queue connected through the Transformer stage.

To answer your question - yes, the incoming xml is producing only 1 row on the output of the XML input stage.

The way I am testing is by manually putting the XML message on the input queue through QPasa. The XML message gets tabularised into 14 columns in the XML Input stage (which is happening), and then these 14 columns should be written to the out queue.

Aneesh

Post by thebird »

Also tried with an Oracle database at the target, with commit frequency set to 1 - nothing was written into the table in always-on mode.

When the job was run with the message limit set to 1, the record was written into the out queue as well as into the database.

Still don't have a clue as to what is happening! Trying to dig in...

Post by eostic »

Yeah...let me know what you find out. I _always_ have a transformer in the middle of my jobs... I do it just as a safety gap in case I need to alter a datatype or change something (inevitably, I always need it).

What happens if you have only a Transformer? Maybe it's just something odd in the combination of the xmlInput and the Transformer...

Ernie

Post by thebird »

If it is only a Transformer, it works fine and writes the message out. The issue seems to be the Transformer in combination with the XML Input stage, as you mentioned.

Post by thebird »

I have another question, prompted by a post that talked of an IBM patch causing an almost similar issue...

If the Message Limit in the MQ stage is set to 5, and the queue has only 3 messages, is the stage supposed to wait until there are 5 messages in the queue before writing all of them together into the output queue? This is how it seems to be happening here. If I run it with, say, a message limit of 5, nothing is seen in the output queue until 5 messages come in and the job finishes successfully. Was wondering if this is normal behaviour.

Any help?

Thanks!

Post by thebird »

Not sure if I am missing anything here.

My XML message -
<?xml version="1.0" encoding="UTF-8"?><root><title>My First XML Document</title></root>

XPaths in the table definition - only 2 fields defined on the output of the XML Input stage:
/root
/root/title/text() -------- set as the repeating element

Transformation Settings in the XML Input stage -
Replace Nulls with empty values (the only option checked; all others are unchecked).
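As a quick sanity check of those XPaths outside DataStage, the sample message can be parsed with Python's standard library (illustrative only; DataStage's XPath handling may differ from ElementTree's limited XPath subset):

```python
# Parse the sample message and extract the repeating element.
import xml.etree.ElementTree as ET

msg = ('<?xml version="1.0" encoding="UTF-8"?>'
       '<root><title>My First XML Document</title></root>')

root = ET.fromstring(msg)                           # corresponds to the /root row path
titles = [t.text for t in root.findall('title')]    # /root/title/text(), the repeating element
```

This confirms the message itself is well-formed and yields exactly one row, so the XPaths look fine.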

I tried running the job on a different server as well, but still get the same issue of the message not being written to the out queue when the job is set to always-on. But the incoming messages are definitely being read: the moment I put a message on the source queue, it disappears (I have selected Destructive Read in the source MQ stage).

Thinking that it is a problem with the XML Input - Transformer combination, I also tried putting a Link Partitioner stage as a placeholder between the XML Input and the Transformer stages - but again the same thing happens.

Now I am not sure how to proceed further.

Thanks!

Post by thebird »

Also tried these designs-

1.
MQ------>XMLInput---------->Trx-------->Oracle
Same result, with no data being written into the database. Had set options in the MQ stage -
- Destructive Read
- Commit/Backout only once at the end of the job

As soon as a message comes into the queue, it disappears (which I believe indicates that it has been read by DataStage). Once the job is forcefully aborted (since nothing was being written to the target database), I would have expected the messages to reappear in the input queue, since they had not been committed yet. But this was not the case - those messages are no longer there. I also noticed 2 things -

a) Once the job has been aborted, the performance statistics show that rows were read from the queue and sent to the target DB.
b) There are phantom processes left over for the Transformer and XML Input stages, because of which, even after the job has been aborted, any message that comes into the queue disappears.

2.
MQ------>XMLInput------>Link Collector--------->Trx-------->Oracle
The intention was to separate the XML Input and Transformer via a placeholder stage. The result was the same.

Still trying various options and digging in further, but apprehensive that I'll reach a dead end soon. :(

Aneesh

Post by eostic »

I applaud your efforts in digging deeper into this. I haven't had a chance to test any scenarios and, unfortunately, am several releases ahead in my environment in both DS and MQ. I have done this in the past, though, so it is strange - but there may be a release dependency that I haven't run into.

The message limit of 5 is just that... it says keep reading until you get 5 messages... if you get only 3, it will wait forever.
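That blocking behaviour can be modelled roughly like this (illustrative Python, not the MQ stage's actual implementation; the function name and timeout parameter are made up for the sketch):

```python
import queue

def read_batch(q, limit, timeout=None):
    """Hypothetical model of the MQ stage's message-limit read: keep
    taking messages until `limit` have arrived, blocking in between.
    With only 3 messages and a limit of 5 it waits indefinitely
    (here, queue.Empty is raised once the optional timeout expires)."""
    batch = []
    while len(batch) < limit:
        batch.append(q.get(timeout=timeout))  # blocks until a message arrives
    return batch
```

With 5 messages available the batch completes immediately; with 3 it never returns, which matches the behaviour described above.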

I'll keep digging around. How big is your xml?

Ernie

Post by eostic »

...another thought..... Have you worked at all with your interprocess row buffering settings? What are they for this job? For this project?
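Why row buffering matters here can be sketched as follows (illustrative Python only; the real inter-process buffer also has a timeout, omitted here for brevity). If the buffer only flushes when full, a low, steady trickle of messages can sit in it indefinitely, making an always-on job look like it writes nothing:

```python
def buffered_pipe(rows, buffer_size):
    """Toy model of inter-process row buffering: rows are held in a
    buffer and only handed downstream when the buffer fills (or the
    stream ends). Returns (delivered_rows, rows_still_in_buffer)."""
    buf, delivered = [], []
    for r in rows:
        buf.append(r)
        if len(buf) >= buffer_size:
            delivered.extend(buf)  # flush a full buffer downstream
            buf = []
    return delivered, buf
```

With a large buffer and a single message, nothing reaches the downstream stage, which is consistent with the symptoms in this thread.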

Ernie