Yes, then the only workaround possible is to use the ODBC or UV stage.
If you are using the ODBC stage, go to the Transformer properties > Input tab. Select the reference link from the drop-down box and check the option "Reference link with multi row result set".
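Outside DataStage, the effect of that option can be pictured as a lookup that may return several rows per key instead of only the first match. A minimal Python sketch (the keys and order values here are invented for illustration):

```python
from collections import defaultdict

# Build a reference "link" keyed on customer id; unlike a plain dict,
# each key can hold several rows (a multi-row result set).
reference_rows = [
    ("C1", "order-100"),
    ("C1", "order-101"),
    ("C2", "order-200"),
]
lookup = defaultdict(list)
for cust_id, order in reference_rows:
    lookup[cust_id].append(order)

# A single input row with key "C1" now drives TWO output rows,
# one per matching reference row.
input_row = ("C1", "Alice")
output = [(input_row[1], order) for order in lookup[input_row[0]]]
print(output)  # [('Alice', 'order-100'), ('Alice', 'order-101')]
```

Without the multi-row option, only the first matching reference row would be returned per input row.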
Regards,
Sumit
Search found 196 matches
- Wed Aug 18, 2004 11:26 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: DRS stage - Reference link with Multiple row result set
- Replies: 5
- Views: 1433
- Wed Aug 18, 2004 10:47 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: DRS stage - Reference link with Multiple row result set
- Replies: 5
- Views: 1433
- Tue Aug 17, 2004 9:38 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Truncate table then insert rows
- Replies: 4
- Views: 2107
- Tue Aug 17, 2004 6:27 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Truncate table then insert rows
- Replies: 4
- Views: 2107
Truncate table then insert rows
Hi, I have a very simple job that reads data from a source table and loads it into a target table. The setup looks like this: SRC --> TFM --> TGT. I am using the DRS (Dynamic RDBMS) stage for the SRC and TGT stages. In the target stage I selected the update action "Truncate table then insert rows". Now wh...
- Tue Aug 17, 2004 11:53 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Reading and writing to a table in same job..
- Replies: 10
- Views: 3485
Even if the array size of the output link is 1, every record that goes into table A will not go into the hash file immediately. It will wait for the input link execution to get over. --> thanks again sumit.. now if I put transaction size as 1... would the problem persist even then? or is the data ri...
- Tue Aug 17, 2004 11:41 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Reading and writing to a table in same job..
- Replies: 10
- Views: 3485
See, this is how your design will work. If the array size and transaction size on the table input are 1, every record will immediately get reflected in the reference table. The hash file load will happen only once the table A load is completely over. So, just making the array size 1 in the table output link w...
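The transaction-size behavior described here can be modeled without a database: written rows sit in a pending buffer and only become visible to lookups when the transaction commits. This toy buffer is an illustration of the idea, not of DataStage internals:

```python
class Table:
    """Toy table where rows become visible only on commit."""
    def __init__(self, transaction_size):
        self.transaction_size = transaction_size
        self.committed = []   # rows a reference lookup would see
        self.pending = []     # rows written but not yet committed

    def write(self, row):
        self.pending.append(row)
        if len(self.pending) >= self.transaction_size:
            self.commit()

    def commit(self):
        self.committed.extend(self.pending)
        self.pending = []

# Transaction size 1: every row is visible immediately after it is written.
t1 = Table(transaction_size=1)
t1.write("row-A")
print(t1.committed)  # ['row-A']

# Transaction size 100: nothing is visible until 100 rows accumulate
# (or processing finishes and forces a final commit).
t100 = Table(transaction_size=100)
t100.write("row-A")
print(t100.committed)  # []
```

This is why transaction size 1 makes each row "immediately get reflected" in the reference table, at the cost of committing on every row.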
- Tue Aug 17, 2004 11:18 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Reading and writing to a table in same job..
- Replies: 10
- Views: 3485
Xan, the design that you gave may work with array size and transaction size as 1, but it will be slow because you are making a lookup to the table. To make a lookup to the hash file you may have to consider the design I proposed. To answer your questions: 1) 1 row is being written to the hash file (its alrea...
- Tue Aug 17, 2004 10:59 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Complex join problem
- Replies: 12
- Views: 4669
How many key columns did you define while creating the hash file. If it is 8 then you have to give a join for all the 8 key columns. If while creating the hash file you gave 8 key columns and now you are using 6 of them for a join condition you will get very unexpected and random results. What you n...
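The full-key requirement can be demonstrated by modeling the hash file as a dict keyed on the complete key tuple (scaled down to three key columns here; the case above used eight):

```python
# Hash-file lookup modeled as a dict keyed on the FULL key tuple.
# Column values are invented; the point is the key structure.
hash_file = {
    ("A", "B", "C"): {"amount": 10},
}

# Join on all 3 key columns: exact key match, lookup succeeds.
print(hash_file.get(("A", "B", "C")))  # {'amount': 10}

# Join on only 2 of the 3 key columns: the tuple no longer matches
# the stored key, so the lookup fails (there is no partial-key match).
print(hash_file.get(("A", "B")))  # None
```

The same mismatch in a real hashed-file lookup does not fail loudly; it simply returns no (or wrong) rows, which is why the results look "unexpected and random".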
- Tue Aug 17, 2004 10:49 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Reading and writing to a table in same job..
- Replies: 10
- Views: 3485
Making a lookup to a hash file will definitely be faster. Well, here is what I would suggest: TableA --> HashA | TFM --> HashA --> TableA. Load everything from Table A to Hash A in the same job. Then make a lookup to the hash file in a transformer and load the same hash file. Remember that you should ha...
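As a rough sketch of that design in Python, with a plain dict standing in for Hash A (all names and values are illustrative):

```python
# Toy version of the suggested design: preload Hash A from Table A,
# then, row by row, look up the hash and write back to the SAME hash,
# so later rows see keys added or updated by earlier rows.
table_a = [("k1", 1)]          # existing rows in Table A
hash_a = dict(table_a)         # preload step: TableA --> HashA

incoming = [("k1", 10), ("k2", 20), ("k2", 30)]
for key, value in incoming:
    if key in hash_a:          # reference lookup against HashA
        print(f"{key}: seen before, last value {hash_a[key]}")
    else:
        print(f"{key}: new key")
    hash_a[key] = value        # write back: TFM --> HashA

print(sorted(hash_a.items()))  # [('k1', 10), ('k2', 30)]
```

Note the second "k2" row finds the first one, which is exactly what the table-based lookup could only guarantee with array size and transaction size 1.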
- Mon Aug 16, 2004 5:34 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Automatically handle activities that fail
- Replies: 11
- Views: 11697
- Mon Aug 16, 2004 5:19 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Automatically handle activities that fail
- Replies: 11
- Views: 11697
- Mon Aug 16, 2004 4:48 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: PeopleSoft EPM ASCL jobs
- Replies: 8
- Views: 2410
- Mon Aug 16, 2004 3:28 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Automatically handle activities that fail
- Replies: 11
- Views: 11697
Automatically handle activities that fail
Hi All, I have a Sequencer Job that calls a Server Job. I want the Sequencer Job to abort/throw warnings if the called Server Job aborts/throws warnings. I know one way to implement this is to call a routine Utility????ToLog also from the Sequencer Job. But since I am on version 7.1 I checked the op...
- Fri Aug 06, 2004 11:23 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Ascential user roles and privileges
- Replies: 3
- Views: 1346
- Fri Aug 06, 2004 11:19 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Run-time error '6'
- Replies: 4
- Views: 1332
Sorry, the DDL didn't come through properly. Here is the post again: Thanks for the suggestion, guys, but the error seems to be something else. No, the error is not system specific. I tried it from two different systems. It gives the same error in both. I went through the DDL of the tables also and it looks ok...