Message Queue

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

harryhome
Participant
Posts: 112
Joined: Wed Oct 18, 2006 7:10 am

Message Queue

Post by harryhome »

Hi,

Here is the scenario: I have a job which runs twice a day and is called from a sequence.

This job takes data from a message queue, and another job processes it.

Now, I want only the first message from the queue to be processed on each run.

If there are two messages on the queue, then the second message should be processed in the next run.

Please guide me.
clarcombe
Premium Member
Posts: 515
Joined: Wed Jun 08, 2005 9:54 am
Location: Europe

Post by clarcombe »

I assume that you are talking about something like MQ Series or MSMQ.

If this is the case, why don't you set up a trigger on the queue which calls a shell/DOS script, which in turn calls your DataStage job (dsjob)? The trigger should be defined to execute on receipt of each message.

Your DataStage job should be able to be invoked multiple times, and you should pass an invocation ID to it. I believe this is possible.

You will need to check the dsjob properties first.
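
Something along these lines might work as a starting point on a Unix host (the project, job and parameter names are only examples, so substitute your own, and check the dsjob syntax for your release):

Code:

#!/bin/sh
# Sketch of a trigger script that starts one DataStage job run per message.
# MyProject, ReadOneMessage and the QueueName parameter are made-up names.

# Source the DataStage environment (location varies by install).
. $DSHOME/dsenv

# Use a timestamp as the invocation id so the same job can be started
# several times without the runs colliding.
INVOCATION_ID=`date +%Y%m%d%H%M%S`

# -jobstatus waits for the job to finish and returns its status
# as the exit code of dsjob.
$DSHOME/bin/dsjob -run \
    -jobstatus \
    -param QueueName=MY.INPUT.QUEUE \
    MyProject ReadOneMessage.$INVOCATION_ID

For the invocation ID to be accepted, the job itself needs "Allow Multiple Instance" ticked in its job properties.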
Colin Larcombe
-------------------

Certified IBM Infosphere Datastage Developer
JoshGeorge
Participant
Posts: 612
Joined: Thu May 03, 2007 4:59 am
Location: Melbourne

Post by JoshGeorge »

You mean two messages in MQ? In that case it is easy: set the message limit in the MQ stage to read one message, and it will read the first message (FIFO). You come out of your first job and call the second job to process this message. The second time you run your sequence, it will process the next message.
Joshy George
harryhome
Participant
Posts: 112
Joined: Wed Oct 18, 2006 7:10 am

Post by harryhome »

Hi, thanks for the reply, but the script approach looks out of my reach/access.

Here is where I have reached so far:

In the Ascential DataStage WebSphere MQ stage properties, under the General tab, I have set Message Limit to 1 and End of Data Message to 0.

This is a destructive read.

Now when more than one message is on the queue, the job reads only the first one.
But it also removes all messages from the queue, when ideally it should remove only the first one, because it is reading only the first one.

Please guide.
JoshGeorge
Participant
Posts: 612
Joined: Thu May 03, 2007 4:59 am
Location: Melbourne

Post by JoshGeorge »

It removes only what it has read (see the options for Destructive Read in that tab). For testing, do a non-destructive read and see if this is still happening. If any other applications/programs are reading from the same queue, you might want to check that as well.
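
If you have command-line access to the queue manager, runmqsc will show who else currently has the queue open (the queue manager and queue names below are only examples):

Code:

# Show the open-handle counts and which applications have the queue open.
# QM1 and MY.INPUT.QUEUE are made-up names - substitute your own.
runmqsc QM1 <<EOF
DISPLAY QLOCAL(MY.INPUT.QUEUE) IPPROCS OPPROCS
DISPLAY QSTATUS(MY.INPUT.QUEUE) TYPE(HANDLE) APPLTAG CHANNEL
EOF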
Joshy George
harryhome
Participant
Posts: 112
Joined: Wed Oct 18, 2006 7:10 am

Post by harryhome »

I have tried with a non-destructive read:

here it reads only the first message and not the second one, which is right.

Now for the destructive read, with Message Limit set to 1 and End of Data Message set to 0,
I think it reads all the data from the queue (and hence removes all of it), but only lets the first row through.

I am not finding any way to restrict it to reading only the first message.

Does 'End of Data Message = 0' have anything to do with this?
eostic
Premium Member
Posts: 3838
Joined: Mon Oct 17, 2005 9:34 am

Post by eostic »

If you use destructive read and only want one message, then that's all it will read before terminating the job. I suppose it's possible that there is a conflict with the End of Data Message Type indicator, although I have never seen a problem there. Put a large integer in for that value (something like 99899)...that property says "if I see a message with a formal MQ Message Type of that value, terminate the job".

Your job should simply read one message and then end. Did you create the queue and the message going into it? See if it, and its messages, have expiry information. That may explain why you lose all messages on a single read.
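
If the MQ sample programs are installed, one way to check this is to browse the queue non-destructively and look at the descriptor of every message, including its Expiry field (the sample path, queue and queue manager names below are only examples):

Code:

# Browse every message on the queue without removing it and dump its
# message descriptor (MQMD) and data. An Expiry value other than -1
# (unlimited) means the queue manager can discard the message after
# that interval. Sample path varies by platform; names are made up.
/opt/mqm/samp/bin/amqsbcg MY.INPUT.QUEUE QM1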

Ernie
harryhome
Participant
Posts: 112
Joined: Wed Oct 18, 2006 7:10 am

Post by harryhome »

It gives the same problem and removes all data from the queue after reading only one row.

The following are set in the MQ stage:

Wait Time: 2
Message Limit: 1
End of Data Message: 99899

And these options are ticked:

Destructive Read
Commit/Backout only once at the end of job
Do data conversion on the MQGET call

Also, I am putting the data on the queue myself. For testing I am putting two rows; after the first row there is an ENTER, and the next row is on the next line.
JoshGeorge
Participant
Posts: 612
Joined: Thu May 03, 2007 4:59 am
Location: Melbourne

Post by JoshGeorge »

Any specific reason why you use 'End Of Data Message 99899'? Try this with zero itself, check 'Ignore end of record', uncheck 'Do data conversion on the MQGET call', and filter for datagram messages.
Joshy George
harryhome
Participant
Posts: 112
Joined: Wed Oct 18, 2006 7:10 am

Post by harryhome »

Thanks for the reply.

I have tried your combinations, but it is still not giving the desired output.

It is still removing all data from the queue.
JoshGeorge
Participant
Posts: 612
Joined: Thu May 03, 2007 4:59 am
Location: Melbourne

Post by JoshGeorge »

Please post your job design here.
Joshy George
harryhome
Participant
Posts: 112
Joined: Wed Oct 18, 2006 7:10 am

Post by harryhome »

Hi, here is the solution/mistake that I found.

While loading data into the queue from a sequential file, I was taking all the rows in the same file and loading them onto the queue at once.

Now I am loading each row onto the queue separately, and the job is working properly.

Thanks for the help. I would still be curious to know how to find out whether all the rows were loaded onto the queue at once...
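
For reference, the same "one row per message" load can be reproduced outside of DataStage with the amqsput sample, and a queue depth check shows how many messages actually landed (file, queue and queue manager names below are only examples):

Code:

# Each line of rows.txt becomes a separate message on the queue, so two
# input rows end up as two messages rather than one.
# amqsput stops at the first blank line or at end of file.
# rows.txt, MY.INPUT.QUEUE and QM1 are made-up names; sample path varies.
/opt/mqm/samp/bin/amqsput MY.INPUT.QUEUE QM1 < rows.txt

# Checking the queue depth afterwards shows how many messages are really
# there - a depth of 1 after putting several rows means they all went in
# as a single message.
echo "DISPLAY QLOCAL(MY.INPUT.QUEUE) CURDEPTH" | runmqsc QM1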
JoshGeorge
Participant
Posts: 612
Joined: Thu May 03, 2007 4:59 am
Location: Melbourne

Post by JoshGeorge »

harryhome wrote: I would still be curious to know how to find out whether all the rows were loaded onto the queue at once...
The same thing will happen again. :) When you do a destructive read, it will remove all the data from the queue.
Joshy George
eostic
Premium Member
Posts: 3838
Joined: Mon Oct 17, 2005 9:34 am

Post by eostic »

It would also help immensely for you to have direct access to the queues outside of DataStage; then this would have been more obvious. While testing you need to be able to see (usually via MQ Explorer) what the queue depth is and what the expiry time is on messages placed into the queue, and to be able to create your own queues and queue managers as needed, etc.
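
When MQ Explorer is not available, runmqsc gives much the same visibility from the command line; a small sketch with made-up names:

Code:

# Create (or redefine) a scratch queue to test against, check its depth,
# and empty it between tests. QM1 and MY.TEST.QUEUE are made-up names.
runmqsc QM1 <<EOF
DEFINE QLOCAL(MY.TEST.QUEUE) REPLACE
DISPLAY QLOCAL(MY.TEST.QUEUE) CURDEPTH
CLEAR QLOCAL(MY.TEST.QUEUE)
EOF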

DataStage can only ever read one message at a time from the queue. It may create more than one "row" from that message, but it's only going to read "one" message each time the plugin is "entered" at runtime.

On the target side, the same thing is true...but you can ask DataStage to "collect" a set of rows before doing the MQPUT.

As for the 99899, that is a very elegant way of terminating MQ jobs. Instead of "per row" as you are doing here, you may have an application that you'd like to terminate programmatically, or otherwise leave running "forever." By using the 99899 (or any other large integer value of your choice) you can have another application (or DataStage) send a message to the queue you are reading, giving that message a formal "Message Type" with that integer value. When DataStage reads that message and sees the matching Message Type, it will have the DS job shut down gracefully --- at your control.

Ernie