Search found 59 matches
- Wed Feb 17, 2010 11:14 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Join of tables with huge volume of data - scratch space
- Replies: 8
- Views: 5151
Join of tables with huge volume of data - scratch space
Hi, I am doing a full outer join of two tables: table1 has 20 million records and table2 has 1000 million records. [Production data volumes] Job Design: SrcTbl1--->Join(with hash and sort by join keys)-->Transformer-->Tgt Tbl | SrcTbl2 ------ I have only one scratch disk with 35GB of space. Pleas...
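A full outer join with sorted, hashed inputs spills sort runs to scratch, so with only 35GB of scratch against inputs this size a pre-flight free-space check is cheap insurance. A minimal sketch, assuming a plain directory path stands in for the resource scratchdisk entry in the configuration file; the threshold is illustrative:

```shell
#!/bin/sh
# Hedged sketch: before launching a large sorted/hashed join, verify the
# scratch filesystem has enough free space. Directory and threshold are
# examples, not values from the post.
check_scratch() {
    dir="$1"       # scratch directory (stand-in for a scratchdisk path)
    need_kb="$2"   # free space required, in KB
    avail_kb=$(df -Pk "$dir" | awk 'NR==2 {print $4}')
    [ "$avail_kb" -ge "$need_kb" ]
}

# Example: require at least 1 KB free in the current directory
check_scratch . 1 && echo "scratch OK" || echo "scratch too small"
```

As a rough rule, the space needed is on the order of the larger sorted input per partition, spread across however many scratchdisk entries the configuration file declares.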
- Wed Feb 17, 2010 10:32 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: execute command to check for the existence of a file on unix
- Replies: 8
- Views: 9376
That's because you are using the 'Automatically handle activities that fail' option, but your triggers should override that if you are using the right pair of triggers out of the stage: 'Conditional (OK)' and 'Otherwise'. Are you saying that you are and it still logs the warning? :? ps. A Wait For F...
- Wed Feb 17, 2010 9:48 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Truncate and Load in Oracle Stage
- Replies: 13
- Views: 12132
Hi, I am doing the same. I am taking the upsert strategy, with update first and then insert, and in the update query I have written a truncate statement, but I am still facing the same problem. Thanks Have you considered the option of a truncate in a separate job that runs prior to your upsert job? In tha...
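Since TRUNCATE is DDL in Oracle (it auto-commits and cannot sit inside the update half of an upsert), the cleaner pattern is the one suggested above: run the truncate as its own step before the upsert job. A hedged sketch using a throwaway SQL script; the table name and connection string are placeholders:

```shell
#!/bin/sh
# Hedged sketch of the "truncate in a separate step" approach: write a
# one-off SQL script and run it ahead of the upsert job.
SQLFILE=$(mktemp)
cat > "$SQLFILE" <<'EOF'
TRUNCATE TABLE target_tbl;
EXIT;
EOF
# sqlplus "user/pass@db" @"$SQLFILE"   # run before the upsert job starts
echo "prepared $SQLFILE"
```

In a job sequence this would typically be an Execute Command activity (or a small before-job subroutine) placed upstream of the load job.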
- Wed Feb 17, 2010 9:36 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: execute command to check for the existence of a file on unix
- Replies: 8
- Views: 9376
- Wed Feb 17, 2010 9:08 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: execute command to check for the existence of a file on unix
- Replies: 8
- Views: 9376
execute command to check for the existence of a file on unix
I am checking for the existence of a file (on unix) using the Execute Command stage (script: ls #Filename#) with these triggers: 1. if expression type 'OK', branch to the file-exists link; 2. if expression type 'Otherwise', branch to file-does-not-exist; and I have other processes from these branches. It is r...
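An alternative to `ls #Filename#` worth considering: `test -f` yields the same exit status the triggers key off, without writing a "No such file or directory" message to stderr. A minimal sketch, with `$1` standing in for the #Filename# job parameter:

```shell
#!/bin/sh
# Sketch of a file-existence check for an Execute Command stage.
file_exists() {
    [ -f "$1" ]    # exit status 0 if the file exists, non-zero otherwise
}

f=$(mktemp)                        # temporary file for the demonstration
file_exists "$f" && echo "file exists"
rm -f "$f"
file_exists "$f" || echo "file does not exist"
```

The 'Conditional (OK)' and 'Otherwise' triggers then branch on that exit status exactly as with `ls`, just without the extra warning noise in the log.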
- Tue Feb 16, 2010 9:10 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Looping logic based on multiple conditions to end loop
- Replies: 11
- Views: 5193
What is the mode of transfer for these files to land in your sourcing directory? If ftp, then you also need to make sure that you do not start reading the files before they are completely written. DSguru2B, we have already taken care of this. The APAP program writes these files into a directory and as soon as it...
- Tue Feb 16, 2010 11:20 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Looping logic based on multiple conditions to end loop
- Replies: 11
- Views: 5193
Thank you everyone for all your valuable details. My data files keep coming over a 2-hour span. I do not want to wait for 2 hours to start the job and process all files at once, as all the files together will be around 50GB+. I want to leverage the 2-hour window to process data in a loop as the files keep coming....
- Mon Feb 15, 2010 9:27 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Looping logic based on multiple conditions to end loop
- Replies: 11
- Views: 5193
You end a loop early simply by branching out of it past the End Loop stage. The easiest way to do that is to park a Sequencer set to 'Any' right after it and branch to that based on the filename found. ... Craig, thank you for the reply! I can only end the loop if the control file is found and no more dat...
- Mon Feb 15, 2010 4:58 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Looping logic based on multiple conditions to end loop
- Replies: 11
- Views: 5193
Looping logic based on multiple conditions to end loop
I have a requirement to run a job in a loop to read files (data files) in a file directory and process them to load into a table. The files keep coming for 2 hours, as many as a few thousand, but once all the files have landed in the directory, there will be another file (control file) that states that all the...
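The loop described above can be sketched in shell terms: keep draining data files as they arrive, and exit only when the control file is present and no data files remain. The directory layout and file patterns are invented for the example:

```shell
#!/bin/sh
# Minimal sketch of the loop: process landed files, then end the loop
# once the control file has appeared and nothing is left to process.
drain_landing() {
    dir="$1"
    while true; do
        for f in "$dir"/data_*; do
            [ -e "$f" ] || continue        # glob matched nothing
            echo "processing $f"           # stand-in for the load job
            rm -f "$f"
        done
        # exit condition: control file present AND no data files remain
        if [ -f "$dir/control.done" ] && ! ls "$dir"/data_* >/dev/null 2>&1; then
            break
        fi
        sleep 5                            # poll interval, illustrative
    done
}
```

Call it as `drain_landing /path/to/landing`; in a job sequence the same two-part test would sit behind the End Loop stage as the early-exit branch condition.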
- Thu May 21, 2009 1:03 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Aggregator stage warning message : Implicit conversion
- Replies: 6
- Views: 12278
Mark the output field of your sum as Nullable: Yes. ... Thank you for all your help. I had been doing that, marking the field as nullable in the output. But it is not null in the input, and it will be loaded into an Oracle table where it is also not null. As a workaround I am marking the output in the Aggregator ...
- Tue May 19, 2009 2:30 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Aggregator stage warning message : Implicit conversion
- Replies: 6
- Views: 12278
- Tue May 19, 2009 9:54 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Aggregator stage warning message : Implicit conversion
- Replies: 6
- Views: 12278
Aggregator stage warning message : Implicit conversion
Job design: dataset 1, dataset 2 --> Funnel --> Aggregator --> Dataset3. I am doing a sum on the source field "Sales_amt" decimal(38,10) not null to an output field "Sales_amt" of decimal(38,10) not null using the Aggregator stage. The output finally gets loaded into an Oracle target with dat...
- Fri May 01, 2009 9:52 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Performance on large look-up table
- Replies: 1
- Views: 1186
Performance on large look-up table
I have a source data volume of one million records in an Oracle table, and I have a dimension table of 80 million records to be looked up to get the DWID. I am using a join to do this on an 8-node configuration. It is taking approximately 15 mins to process the data and load...
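For reference, the sort-merge pattern a parallel Join stage relies on (partition both inputs on the key, sort them, then match in one pass) can be mimicked on toy data with the unix `sort` and `join` utilities; everything below is invented for illustration:

```shell
#!/bin/sh
# Toy demonstration of sort-then-merge joining: both inputs sorted on the
# key column, then matched in a single sequential pass.
workdir=$(mktemp -d)
printf 'k2,orderA\nk1,orderB\n'   > "$workdir/source.csv"
printf 'k1,DWID100\nk2,DWID200\n' > "$workdir/dim.csv"

sort -t, -k1,1 "$workdir/source.csv" > "$workdir/source.sorted"
sort -t, -k1,1 "$workdir/dim.csv"    > "$workdir/dim.sorted"
join -t, -1 1 -2 1 "$workdir/source.sorted" "$workdir/dim.sorted"
```

At 80 million reference rows, a Join with both inputs hash-partitioned and sorted on the key is usually the right call over a Lookup, since a normal Lookup wants the reference data to fit in memory.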
- Tue Sep 23, 2008 10:14 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: How to Pass dataset field value to job parameter/parameter s
- Replies: 2
- Views: 2006
How to Pass dataset field value to job parameter/parameter s
I have a job1 that retrieves a control parameter (region name) from a table (Teradata) and writes it to a dataset. My job2 has to read the region name from the dataset created in job1 and receive it as a parameter in the SQL selection criteria of the Teradata connector stage. Example:...
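One common workaround, sketched below: have job1 write the value to a plain sequential file rather than a dataset, then let the job sequence (e.g. an Execute Command activity) read it back and feed it to job2 as a parameter. Project, job, and parameter names are invented; the `dsjob` line is only echoed here, not run:

```shell
#!/bin/sh
# Sketch of passing a value between jobs via a flat file.
PARAMFILE=$(mktemp)
echo "EMEA" > "$PARAMFILE"        # stand-in for job1 writing the region

REGION=$(head -1 "$PARAMFILE")    # sequence-side read of the value
# Illustrative invocation only; names are placeholders:
echo "dsjob -run -param pRegion=$REGION myProject job2"
```

Within a sequence the same effect is usually achieved with a UserVariables activity whose expression reads the file, with the variable then referenced in the Job Activity's parameter grid.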
- Sun Sep 14, 2008 7:50 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: BW Load - Critical Failure, no job name for PULL job
- Replies: 6
- Views: 4383
Re: BW Load - Critical Failure, no job name for PULL job
Try using the Push method by making ETL the master. If SAP does not respond with the proper info package name or with the request ID, it is a problem to be fixed by SAP and not ETL. Thank you all for your replies. Answer to Ray's question: Nothing has been changed, neither on the SAP side nor on the Infoserver/BW Pack...