Search found 12 matches
- Wed Mar 09, 2005 7:32 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Warning in Remove Duplicates Stage
- Replies: 2
- Views: 1872
Warning in Remove Duplicates Stage
Hi, while using the Remove Duplicates stage I get this warning: "Remove_Duplicates_22: When checking operator: User inserted sort "Remove_Duplicates_22.DSLink27_Sort" does not fulfill the sort requirements of the downstream operator "Remove_Duplicates_22"" Please explain ho...
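The warning means the user-inserted sort does not sort on the same keys (or in the same order) that the downstream Remove Duplicates operator requires. The underlying dependency is general to streaming deduplication, not specific to DataStage; a minimal Python sketch with hypothetical column names illustrates why unsorted input defeats a keep-first dedup:

```python
from itertools import groupby

# Streaming remove-duplicates (keep first row per key) only works when
# the input is already sorted on the deduplication key.
rows = [
    {"cust_id": 2, "amt": 10},
    {"cust_id": 1, "amt": 5},
    {"cust_id": 2, "amt": 7},   # duplicate of cust_id 2, arriving later
]
key = lambda r: r["cust_id"]

# Unsorted input: groupby sees cust_id 2 in two separate runs,
# so both copies survive as "first of group".
unsorted_firsts = [next(g) for _, g in groupby(rows, key)]
assert len(unsorted_firsts) == 3

# Sorted on the dedup key first: exactly one row per cust_id.
sorted_firsts = [next(g) for _, g in groupby(sorted(rows, key=key), key)]
assert len(sorted_firsts) == 2
```

This is why the stage insists that any explicit upstream sort cover the duplicate keys in the order it expects.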
- Wed Feb 02, 2005 3:13 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: How to load data from oracle partitioned table in Px.
- Replies: 3
- Views: 5125
How to load data from oracle partitioned table in Px.
Hi, I have to load data from a range-partitioned Oracle table, partitioned by date on a weekly basis. My target table is a non-partitioned Oracle table. I need to know what Oracle stage settings I need to make to load the data from this partitioned table successfully. The source table contains aroun...
- Tue Feb 01, 2005 8:56 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: How to capture oracle exceptions in DS Px
- Replies: 12
- Views: 5921
How to capture oracle exceptions in DS Px
Hi. I need to capture Oracle-raised exceptions, like inserting NULL into a non-nullable field, a lookup failure exception, or any other exception that would cause DataStage to abort the job while running. I wish to capture such attributes in a separate table. Please suggest if there is a method to do so i...
- Fri Jan 28, 2005 12:18 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: What is the best way to load Huge Volume of History data?
- Replies: 3
- Views: 2298
What is the best way to load Huge Volume of History data?
Hi,
What is the best method to load voluminous data into Oracle tables using a parallel job in DS? What stages can I use that would be most helpful for fast loading?
- Thu Jan 27, 2005 11:38 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: What is the best way to load Huge Volume of History data?
- Replies: 1
- Views: 1466
What is the best way to load Huge Volume of History data?
Hi,
What is the best method to load voluminous data into Oracle tables using a parallel job in DS? What stages can I use that would be most helpful for fast loading?
- Thu Jan 27, 2005 9:49 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: How to prevent skip in oracle sequence in PX Job?
- Replies: 1
- Views: 2781
How to prevent skip in oracle sequence in PX Job?
Hi, I have developed a parallel job in DataStage that loads the target table and assigns an Oracle sequence for every record inserted. During update, it only updated the record without incrementing the surrogate key sequence. Earlier the Oracle sequence was created with a cache=20, but then the DBA r...
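The skips described here are consistent with how Oracle's sequence caching works: each session reserves a block of CACHE values in memory, and any unused values in that block are lost when the cache is discarded (for example on instance restart). A toy Python model of that allocation pattern (this is an illustration, not Oracle code; all names are invented):

```python
class CachedSequence:
    """Toy model of a CACHE n sequence: values are handed out from an
    in-memory block; discarding the block loses its remaining values."""

    def __init__(self, cache=20):
        self.cache = cache
        self.high_water = 0   # highest value reserved "on disk"
        self.next_val = 1
        self.block_end = 0

    def nextval(self):
        if self.next_val > self.block_end:
            # reserve a fresh block of `cache` values
            self.high_water += self.cache
            self.block_end = self.high_water
        v = self.next_val
        self.next_val += 1
        return v

    def lose_cache(self):
        # simulate a restart: unused cached values are gone for good
        self.next_val = self.high_water + 1

seq = CachedSequence(cache=20)
first_three = [seq.nextval() for _ in range(3)]  # 1, 2, 3
seq.lose_cache()                                 # values 4..20 are lost
after_restart = seq.nextval()                    # 21 -> a visible gap
```

With NOCACHE (or CACHE 1) the restart gap disappears, at the cost of a data-dictionary update on every NEXTVAL; gaps from rolled-back inserts still occur either way.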
- Thu Jan 27, 2005 3:06 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: How to prevent skip in oracle sequence in DS Parallel Job
- Replies: 2
- Views: 2573
How to prevent skip in oracle sequence in DS Parallel Job
Hi, I have developed a parallel job in DataStage that loads the target table and assigns an Oracle sequence for every record inserted. During update, it only updated the record without incrementing the surrogate key sequence. Earlier the Oracle sequence was created with a cache=20, but then the DBA r...
- Thu Dec 23, 2004 5:15 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Error in executing Datastage job
- Replies: 8
- Views: 7324
- Thu Dec 23, 2004 2:57 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Variable file length when used in Datastage fails
- Replies: 2
- Views: 1766
Variable file length when used in Datastage fails
We are validating flat files using a DataStage job. We validate only up to a length of 121, but in some instances we receive files with greater lengths. The entire file is read into a single column in a Sequential File stage, and we select fixed width for the sequential file. In such cases the job currently f...
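The validation rule itself (whole record in one column, reject anything over 121 characters instead of failing the job) is easy to prototype outside DataStage. A minimal Python sketch, where only the 121 limit comes from the post and everything else is illustrative:

```python
MAX_LEN = 121  # maximum record length the job accepts (from the post)

def split_by_length(lines, max_len=MAX_LEN):
    """Partition records into (valid, rejected) rather than aborting
    on the first over-length line."""
    valid, rejected = [], []
    for line in lines:
        record = line.rstrip("\n")
        (valid if len(record) <= max_len else rejected).append(record)
    return valid, rejected

records = ["A" * 100, "B" * 121, "C" * 150]
valid, rejected = split_by_length(records)
# the 100- and 121-char records pass; the 150-char record is rejected
```

Routing over-length rows to a reject link, as sketched here, is generally preferable to a fixed-width read that aborts on unexpected lengths.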
- Thu Dec 23, 2004 2:47 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Error in executing Datastage job
- Replies: 8
- Views: 7324
Re: Error in executing Datastage job
We have developed a server job. We are using the DsrunJob utility to execute the server job. If the job runs for a longer duration, it gives return code 141; but if the same job runs for short durations, it does not produce the error. We found a macro in file Sqlext...
- Wed Dec 22, 2004 3:44 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Error in executing Datastage job
- Replies: 8
- Views: 7324
Error in executing Datastage job
We have developed a server job. We are using the DsrunJob utility to execute the server job. If the job runs for a longer duration, it gives return code 141; but if the same job runs for short durations, it does not produce the error. We found a macro in file Sqlext....
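One possible reading of the 141, assuming the utility's exit status follows the usual Unix shell convention (which is not guaranteed for DataStage-specific return codes): a process killed by signal N is reported as 128 + N, and 128 + 13 = 141 points at SIGPIPE, which fits a long-running job whose output consumer goes away. A tiny Python check of the arithmetic:

```python
import signal

# Unix shells report "killed by signal N" as exit status 128 + N.
# 141 would therefore correspond to SIGPIPE (signal 13), e.g. the
# process kept writing after its reader disconnected mid-run.
assert signal.SIGPIPE == 13
assert 128 + signal.SIGPIPE == 141
```

If the convention applies, the long-duration correlation makes sense: only long runs live long enough for the other end of the pipe to disappear.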
- Mon Nov 08, 2004 9:07 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Error while compiling a simple parallel job
- Replies: 1
- Views: 1738
Error while compiling a simple parallel job
Hi, when I am trying to compile the simple parallel job it gives the following error. My job consists of one Oracle Enterprise stage as source, one Transformer, and a Sequential File as target; it is a direct mapping from source to target, with no functions or calculations used in the Transformer stage. ##I TFCN...