Search found 196 matches

by sumitgulati
Wed Aug 18, 2004 11:26 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: DRS stage - Reference link with Multiple row result set
Replies: 5
Views: 1433

Yes, then the only workaround possible is to use an ODBC or UV stage.
If you are using an ODBC stage then go to the transformer properties \ Input tab. Select the reference link from the drop-down box and check the option "Reference link with multi row result set".

Regards,
Sumit
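To make the behaviour of that option concrete, here is a small, purely illustrative Python sketch (not DataStage code; the row data and column names are invented). Without "Reference link with multi row result set" only the first matching reference row is used; with it, the transformer produces one output row per matching reference row.

# Illustrative sketch only -- not DataStage code. Row data and column
# names are invented for the example.
stream_rows = [{"cust_id": 1}, {"cust_id": 2}]
reference_rows = [
    {"cust_id": 1, "order_no": "A1"},
    {"cust_id": 1, "order_no": "A2"},
    {"cust_id": 2, "order_no": "B1"},
]

def lookup_single(key):
    # Default lookup behaviour: only the first match is returned.
    for ref in reference_rows:
        if ref["cust_id"] == key:
            return [ref]
    return []

def lookup_multi(key):
    # "Reference link with multi row result set": every match is returned,
    # so the transformer fires once per matching reference row.
    return [ref for ref in reference_rows if ref["cust_id"] == key]

for row in stream_rows:
    for ref in lookup_multi(row["cust_id"]):
        print(row["cust_id"], ref["order_no"])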
by sumitgulati
Wed Aug 18, 2004 10:47 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: DRS stage - Reference link with Multiple row result set
Replies: 5
Views: 1433

What is the exact requirement? In case multiple matches are found in the lookup, do you want:
1) All of them to be returned OR
2) First value to be returned OR
3) Last value to be returned

Regards,
Sumit
by sumitgulati
Tue Aug 17, 2004 9:38 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Truncate table then insert rows
Replies: 4
Views: 2107

Thank you Ray.

Yes, I always noticed the Truncate/Delete statements in the log but could never find them in the Job.

Thanks again,
-Sumit
by sumitgulati
Tue Aug 17, 2004 6:27 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Truncate table then insert rows
Replies: 4
Views: 2107

Truncate table then insert rows

Hi, I have a very simple job that reads data from a source table and loads it into a target table. The setup looks like this: SRC --> TFM --> TGT. I am using the DRS (Dynamic RDBMS) stage for the SRC and TGT stages. In the target stage I selected the Update action "Truncate table then insert rows". Now wh...
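For anyone unfamiliar with that Update action, the following is a rough sketch of what "Truncate table then insert rows" amounts to on the database side. It is plain Python printing placeholder SQL, not what the DRS stage actually sends; the table and column names are invented and the exact statements depend on the target database.

# Rough, hypothetical sketch of the SQL behind "Truncate table then insert
# rows". Table/column names are placeholders; the real statements generated
# by the stage may differ per database.
target_table = "TGT_TABLE"
rows = [(1, "abc"), (2, "def")]          # rows arriving from the transformer

statements = ["TRUNCATE TABLE " + target_table]   # issued once, before any insert
statements += [
    "INSERT INTO %s (ID, NAME) VALUES (%d, '%s')" % (target_table, id_, name)
    for id_, name in rows
]

for sql in statements:
    print(sql)   # in the real job these go to the database, not stdout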
by sumitgulati
Tue Aug 17, 2004 11:53 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Reading and writing to a table in same job..
Replies: 10
Views: 3485

Even if the array size of the output link is 1, every record that goes into table A will not go into the hash file immediately. It will wait for the input link execution to finish. --> thanks again sumit.. now if i put transaction size as 1... would the problem persist even then? or is the data ri...
by sumitgulati
Tue Aug 17, 2004 11:41 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Reading and writing to a table in same job..
Replies: 10
Views: 3485

See, this is how your design will work. If the array size and transaction size on the table input are 1, every record will immediately be reflected in the reference table. The hash file load will happen only once the table A load is completely over. So, just making the array size 1 on the table output link w...
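A tiny Python sketch of the commit behaviour being described may help (illustrative only; it assumes a simple commit-every-N-rows model, which is roughly what the transaction size setting controls). With transaction size 1, each row is committed, and therefore visible to a lookup against the same table, as soon as it is written.

# Illustrative model of transaction size: rows become visible to a lookup
# against the same table only when the pending batch is committed.
committed = []      # what a lookup against the table would see
pending = []        # rows written but not yet committed

def write_row(row, transaction_size):
    pending.append(row)
    if len(pending) >= transaction_size:
        committed.extend(pending)    # commit makes the rows visible
        pending.clear()

for row in ["r1", "r2", "r3"]:
    write_row(row, transaction_size=1)
    print(row, "-> visible to lookup:", committed)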
by sumitgulati
Tue Aug 17, 2004 11:18 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Reading and writing to a table in same job..
Replies: 10
Views: 3485

Xan, the design that you gave may work with array size and transaction size as 1, but it will be slow because you are doing a lookup against the table. To do a lookup against the hash file you may have to consider the design I proposed. To answer your questions: 1) 1 row is being written to the hash file (its alrea...
by sumitgulati
Tue Aug 17, 2004 10:59 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Complex join problem
Replies: 12
Views: 4669

How many key columns did you define while creating the hash file? If it is 8 then you have to join on all 8 key columns. If, while creating the hash file, you gave 8 key columns and you are now using only 6 of them in the join condition, you will get very unexpected and random results. What you n...
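As a rough analogy (plain Python, key values invented): the hashed file behaves like a dictionary keyed on the tuple of all the key columns it was created with, so the lookup has to supply every one of them to find a record.

# Analogy only: a hashed file lookup works like a dict keyed on the full
# tuple of key columns. Values here are invented.
hash_file = {
    ("C1", "C2", "C3", "C4", "C5", "C6", "C7", "C8"): {"amount": 100},
}

full_key = ("C1", "C2", "C3", "C4", "C5", "C6", "C7", "C8")
print(hash_file.get(full_key))       # match: {'amount': 100}

partial_key = ("C1", "C2", "C3", "C4", "C5", "C6")
print(hash_file.get(partial_key))    # None -- only 6 of the 8 key parts supplied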
by sumitgulati
Tue Aug 17, 2004 10:49 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Reading and writing to a table in same job..
Replies: 10
Views: 3485

A lookup against a hash file will definitely be faster. Here is what I would suggest: TableA --> HashA | TFM --> HashA --> TableA. Load everything from Table A to Hash A in the same job. Then do the lookup against the hash file in a transformer and load the same hash file. Remember that you should ha...
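The idea behind that design can be sketched in a few lines of Python (illustrative only; the key values are made up): pre-load Table A into a lookup structure, then, while transforming, read from and write back to that same structure so each incoming row can see the keys added by the rows before it.

# Sketch of the proposed design: TableA --> HashA up front, then the
# transformer both reads HashA and writes new keys back to it.
table_a = {"K1": "existing"}        # pre-load step: TableA --> HashA
hash_a = dict(table_a)

incoming = ["K1", "K2", "K2", "K3"]

for key in incoming:                # TFM --> HashA --> TableA
    if key in hash_a:
        print(key, "already exists, reuse it")
    else:
        print(key, "is new, insert it")
        hash_a[key] = "new"         # write-back makes it visible to later rows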
by sumitgulati
Mon Aug 16, 2004 5:34 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Automatically handle activities that fail
Replies: 11
Views: 11697

Kim, if I still have to explicitly use UtilityAbortToLog/UtilityWarnToLog then what is the "Automatically handle activities that fail" option for? My understanding is that if we check "Automatically handle activities that fail" in the Administrator, then while creating the sequencer ...
by sumitgulati
Mon Aug 16, 2004 5:19 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Automatically handle activities that fail
Replies: 11
Views: 11697

If a Server Job aborts, the Sequencer that called the server job is still shown in FINISHED status. Ideally the Sequencer job should also abort. I guess that's what the "Automatically handle activities that fail" option is meant to do. But it does not seem to be working.

Regards,
Sumit
by sumitgulati
Mon Aug 16, 2004 4:48 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: PeopleSoft EPM ASCL jobs
Replies: 8
Views: 2410

I would also like to join a PeopleSoft group if it gets created.
by sumitgulati
Mon Aug 16, 2004 3:28 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Automatically handle activities that fail
Replies: 11
Views: 11697

Automatically handle activities that fail

Hi All, I have a Sequencer Job that calls a Server Job. I want the Sequencer Job to abort/throw warnings if the called Server Job aborts/throws warnings. I know one way to implement this is to also call a Utility????ToLog routine from the Sequencer Job. But since I am on version 7.1 I checked the op...
by sumitgulati
Fri Aug 06, 2004 11:23 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Ascential user roles and privileges
Replies: 3
Views: 1346

The users we are facing the issue with have been assigned to the Ascential Designer group.

Regards,
Sumit
by sumitgulati
Fri Aug 06, 2004 11:19 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Run-time error '6'
Replies: 4
Views: 1332

Sorry, the DDL didn't come through properly. Here is the post again: Thanks for the suggestions guys, but the error seems to be something else. No, the error is not system specific. I tried it from two different systems. It gives the same error in both. I went through the DDL of the tables also and it looks ok...