moving data to Mainframe via MQ - not all data getting there

smeliot
Participant
Posts: 19
Joined: Sun Mar 18, 2007 7:31 pm

Post by smeliot »

We're moving data (defined via a Complex Flat File stage) to a mainframe queue. It's getting there, but only the first field (2 bytes) is showing up in the queue. All 4 records are there, but only the first field of each.

Options used in MQ Stage:
Server Mode
Context mode: Set all
Cluster queue: No
Dynamic queue: No
Message write mode: Create
Record count: 0

Thanks in advance.
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

Have you tried using the MQ Connector stage? We are using MQ from the mainframe with very long, variable-length messages with no issues.
smeliot
Participant
Posts: 19
Joined: Sun Mar 18, 2007 7:31 pm

Post by smeliot »

ArndW wrote:Have you tried using the MQ Connector stage? We are using MQ from the mainframe with very long, variable-length messages with no issues. ...
That's what we're using - a CFF stage feeding an MQ Connector stage. We're not receiving messages, though; we're sending them.

Thanks.
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

Sorry, my mistake.

Shouldn't you have just one output column, called "Payload", comprising all the column data, and write to that? The template for this is in Table Definitions -> Real Time -> WebSphere MQ Connector -> MQMessage.
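
As a rough sketch (the type shown here is an assumption, not necessarily the template's exact definition), the output link would then carry a single column along the lines of:

Payload (LongVarBinary) - the entire message body in one column

rather than one link column per source field.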
smeliot
Participant
Posts: 19
Joined: Sun Mar 18, 2007 7:31 pm

Post by smeliot »

ArndW wrote:Sorry, my mistake.

Shouldn't you have just one output column, called "Payload", comprising all the column data, and write to that? The template for this is in Table Definitions -> Real Tim ...
Well, unfortunately, I can't see your whole message, so I'll make a guess and look for the table definition. However, since we have 27 fields in our input, would we now need a transform between CFF and MQ that concatenates all 27 fields into one output field (the Payload)? Or should I redefine my input file to be just one field and use that for this job?

Thanks.
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

Yes, you will need to concatenate the fields into one; you can use a Transformer stage or a Column Export stage to do this.
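
For example, a Transformer output derivation along these lines (the link and column names are made up for illustration):

Payload derivation: lnk_in.FIELD1 : lnk_in.FIELD2 : lnk_in.FIELD3

with ":" being the DataStage concatenation operator, extended across all of your input columns. One caveat: non-string columns (decimals from COMP-3 fields, for instance) are converted to their display form by ":", so the bytes that arrive may not match the original packed data.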
smeliot
Participant
Posts: 19
Joined: Sun Mar 18, 2007 7:31 pm

Post by smeliot »

ArndW wrote:Yes, you will need to concatenate the fields into one; you can use a Transformer stage or a Column Export stage to do this. ...
Okay - I've concatenated a couple of different ways and was able to send to the queue. However, only 72 of the 775 bytes of the now-single field were received (again, all 4 records were sent and received). I'm thinking that I should be specifying something in the header instead of taking all the defaults. I'm investigating that now, but if anyone has any revelations, I'll gladly listen. :D

Thanks
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

The "Maximum Message Length" is a Queue setup parameter - perhaps that is set too small in your queue?
smeliot
Participant
Posts: 19
Joined: Sun Mar 18, 2007 7:31 pm

Post by smeliot »

ArndW wrote:The "Maximum Message Length" is a Queue setup parameter - perhaps that is set too small in your queue? ...
Finally figured it out. I had to lie to DataStage and tell it that the file I created in one job with CFF (with 127 fields, about a third of them defined as COMP-3) was actually a Sequential File with one binary field, length 775. Then things work.
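
In schema terms, the single-column read amounts to something like this (a sketch; "payload" is just an illustrative name):

record { record_delim=none } ( payload: raw[775]; )

i.e. one fixed 775-byte binary field per record, so the COMP-3 bytes pass through untouched instead of being interpreted as decimals.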

Thanks!