Search found 500 matches

by ag_ram
Mon Mar 24, 2008 7:43 am
Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
Topic: DataStage Server Engine(UniVerse)
Replies: 5
Views: 6570

Thanks Ray for your immediate response. Here I need your suggestion on point 3. The UniVerse-based technology does not enforce the same number of columns in every record. In addition, it supports nested structures, which made UniVerse an ideal choice for storing DataStage design objects (not just job object...
by ag_ram
Mon Mar 24, 2008 6:20 am
Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
Topic: DataStage Server Engine(UniVerse)
Replies: 5
Views: 6570

Thank you guys for your witty responses. Here I pose a few more questions based on your answers. 1. Is there a DataStage Server Engine architecture diagram that illustrates the interfacing between the DataStage GUI and UniVerse, including the flow of the BASIC language? 2. Ray, RT_CONFIGnn contains...
by ag_ram
Mon Mar 24, 2008 2:50 am
Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
Topic: DataStage Server Engine(UniVerse)
Replies: 5
Views: 6570

DataStage Server Engine(UniVerse)

Hello everybody, I have a few questions about the DataStage Server Engine (UniVerse). Questions: 1. How does the DataStage Server engine function internally with respect to UniVerse, BASIC, and DataStage? How do these components interact with each other when a Server job is designed and executed? 2. What ar...
by ag_ram
Thu Mar 13, 2008 12:58 pm
Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
Topic: Performance Issue
Replies: 5
Views: 2026

It is not giving any error message. Our concern is slowness; the job works fine, but we want it to process faster. I can't increase the transaction count, as it is message-by-message processing; it would create duplicates at the end if I processed in batch mode.
by ag_ram
Wed Mar 12, 2008 4:44 pm
Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
Topic: Performance Issue
Replies: 5
Views: 2026

Performance Issue

My job is designed as below: MQ --> XML input stage --> transformer --> funnel --> XML output --> transformer --> MQ. We get around 2 million records into the queue in a very short time. Our job is not able to consume them fast enough, and MQ gets overloaded; finally the job aborted. When I checked the pro...
by ag_ram
Tue Mar 11, 2008 1:22 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: MQ related Job - Lost one message in every run
Replies: 3
Views: 1306

It's because of European characters in the data. I set the encoding format to none during XML generation. Now the issue is resolved. Thanks.
by ag_ram
Wed Mar 05, 2008 9:06 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: MQ related Job - Lost one message in every run
Replies: 3
Views: 1306

MQ related Job - Lost one message in every run

I have a real-time job where MQ stages are the source and target. Job structure: MQ --- XML input stage --- transformer --- XML output stage --- transformer stage --- MQ. The source has 19000 records, and in every run it posts 18999 records. XML output stage. MQ sour...
by ag_ram
Mon Mar 03, 2008 5:57 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: MQ with high load
Replies: 3
Views: 919

One way I can find is to set the "transaction record count" property of the target MQ stage to 0. This way I can ensure that all source records are committed if the job succeeds, otherwise none. Is this the right approach?
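For what it's worth, the all-or-nothing behaviour a transaction record count of 0 is meant to give (every message becomes visible only when the whole job succeeds, otherwise nothing) can be sketched with a toy Python class. `TransactionalQueue` and `run_job` are hypothetical illustrations of the idea, not the MQ stage or the MQ API:

```python
class TransactionalQueue:
    """Toy queue: puts are buffered until commit(), discarded on backout()."""

    def __init__(self):
        self.delivered = []   # messages visible to consumers
        self._pending = []    # messages put under syncpoint, not yet committed

    def put(self, msg):
        self._pending.append(msg)

    def commit(self):
        # Make all pending messages visible at once.
        self.delivered.extend(self._pending)
        self._pending = []

    def backout(self):
        # Job failed: discard everything put since the last commit.
        self._pending = []


def run_job(queue, records, fail_at=None):
    """Put every record under one transaction; commit only if all succeed."""
    try:
        for i, rec in enumerate(records):
            if fail_at is not None and i == fail_at:
                raise RuntimeError("job aborted")
            queue.put(rec)
        queue.commit()
    except RuntimeError:
        queue.backout()
```

With this scheme an aborted run leaves the queue empty, so a reset-and-rerun cannot create duplicates; the trade-off is that the queue manager must hold the entire uncommitted batch in one unit of work, which may exceed its configured transaction limits for very large runs.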
by ag_ram
Mon Mar 03, 2008 5:52 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: MQ with high load
Replies: 3
Views: 919

Hi Ray,
My source is a dataset and the target is a queue. If the job aborts at the 5 lakh (500,000) record level, then with the current queue settings 5 lakh messages are available. After a reset, I want the job to start processing the remaining data rather than all the data available in the source dataset.
by ag_ram
Mon Mar 03, 2008 4:09 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: MQ with high load
Replies: 3
Views: 919

MQ with high load

I have a job with one dataset and an MQ stage. The source dataset has 18 lakh (1.8 million) XML messages. Before the job aborted, 5 lakh messages had been posted. Now, after resetting the job, I want it to start processing from the next message, but the job starts processing from message one again. This way we end up with duplicates in...
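One common way to get this restart-from-where-it-stopped behaviour (independent of anything the MQ stage offers) is to checkpoint the last successfully posted record and skip up to it on the next run. A minimal Python sketch, where the checkpoint file name and the `post_message` callback are assumptions for illustration:

```python
import os

CHECKPOINT = "job.checkpoint"  # hypothetical path

def load_checkpoint():
    """Return how many records were already posted, 0 on a fresh start."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return int(f.read().strip() or 0)
    return 0

def run(records, post_message):
    """Post records, skipping those already done, checkpointing progress."""
    done = load_checkpoint()
    for i, rec in enumerate(records):
        if i < done:
            continue  # already posted in a previous (aborted) run
        post_message(rec)
        with open(CHECKPOINT, "w") as f:
            f.write(str(i + 1))  # record progress after each successful put
    os.remove(CHECKPOINT)  # clean finish: next run starts from scratch
```

Checkpointing after every message is slow (it mirrors message-by-message transaction counts), so in practice you would checkpoint every N records and accept redelivering at most N-1 messages after an abort.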
by ag_ram
Fri Feb 22, 2008 3:22 pm
Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
Topic: Link Count for always run job
Replies: 6
Views: 1935

Yes, but if you have an output file created every 5 minutes, then you only lose the last 5 minutes' worth of statistics. If you need every row accounted for, then you have to implement other changes. If you continuously update the same file, then you have to deal with operating system buffering and/or excl...
by ag_ram
Fri Feb 22, 2008 2:14 pm
Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
Topic: Link Count for always run job
Replies: 6
Views: 1935

Write a DS function that outputs the metrics you desire to a sequential file. Consider using a Common block to hold row counts as the job continuously runs. Then, every hour, output a file containing the metrics for that hour. You can easily gather up the metrics for the day that way. The reason I would ...
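The Common-block-plus-hourly-file idea above can be sketched in Python; the `LinkCounter` class and the metrics file naming scheme are illustrative assumptions, not a DataStage API:

```python
import time

class LinkCounter:
    """In-memory row counter (the Common-block analogue), flushed per window."""

    def __init__(self, interval_s=3600, now=time.time):
        self.count = 0
        self.interval = interval_s
        self.now = now                 # injectable clock, for testing
        self.window_start = now()
        self.flushed = []              # (filename, count) pairs written out

    def increment(self):
        self.count += 1
        # Flush whenever the current window has elapsed.
        if self.now() - self.window_start >= self.interval:
            self.flush()

    def flush(self):
        fname = "metrics_%d.txt" % int(self.window_start)
        self.flushed.append((fname, self.count))
        # In a real routine you would write self.count to fname here;
        # writing a fresh file per window sidesteps the OS buffering and
        # locking issues of continuously updating one shared file.
        self.count = 0
        self.window_start = self.now()
```

Summing the per-hour files then gives a daily total even though dsjob reports 0 link counts for a job that never ends.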
by ag_ram
Fri Feb 22, 2008 12:59 pm
Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
Topic: Link Count for always run job
Replies: 6
Views: 1935

Link Count for always run job

I have a job whose source and target are both MQ. This job runs continuously, 24*7. As the job never ends, we are not able to find the link count for the job, since the dsjob command returns 0. But I have a requirement to find the number of records loaded in a single day. Can anyone give a suggestion on...
by ag_ram
Fri Feb 15, 2008 11:59 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Fatal: Transfer does not have the output interface destination
Replies: 5
Views: 972

I have recompiled and tested it. Now there is no error, but I am not able to find the real cause.