I created the same environment variables, with the same names, in the new environment but did not migrate the parameter files.
But if I am creating the same environment variables in the new environment, why do I need to copy the DSParams file contents to the new environment?
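One reason the question comes up: Designer's job export does not carry project-level environment variable definitions; those live in each project's DSParams file, so the variables must either be re-created in the new project or the relevant DSParams entries copied over. A minimal sketch of diffing the two files follows. The install paths and the [EnvVarDefns] line layout are illustrative only (the exact format varies by DataStage version), and sample files stand in for the real ones so the commands run as-is:

```shell
# Hypothetical stand-ins for the two projects' DSParams files.
# Real paths would look like .../Server/Projects/<project>/DSParams.
mkdir -p /tmp/dsparams_demo
cat > /tmp/dsparams_demo/dev_DSParams <<'EOF'
[EnvVarDefns]
DB_USER\User Defined\-1\String\\0\Project\DB_USER\
DB_PASS\User Defined\-1\String\\0\Project\DB_PASS\
EOF
cat > /tmp/dsparams_demo/new_DSParams <<'EOF'
[EnvVarDefns]
DB_USER\User Defined\-1\String\\0\Project\DB_USER\
EOF
# The variable name is the token before the first backslash;
# list names defined in dev but missing from the new project:
comm -23 \
  <(grep '\\' /tmp/dsparams_demo/dev_DSParams | cut -d'\' -f1 | sort) \
  <(grep '\\' /tmp/dsparams_demo/new_DSParams | cut -d'\' -f1 | sort)
# -> DB_PASS   (only in dev; must be re-created in the new project)
```

If the names match but jobs still fail, compare the values too, since the export carries only the jobs' references to the variables, not their project-level values.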
Search found 219 matches
- Wed Nov 28, 2007 12:27 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Environment variables in different environment not working
- Replies: 3
- Views: 1245
- Mon Nov 26, 2007 11:54 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Environment variables in different environment not working
- Replies: 3
- Views: 1245
Environment variables in different environment not working
We have a development project in which we have defined many environment variables. A few of these are for database connection properties such as connection credentials. Now, when we export the jobs to a new environment where all the environment variables are the same as in the dev env, only the value of...
- Thu Nov 22, 2007 3:40 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Error while saving jobs in IIS
- Replies: 1
- Views: 800
Error while saving jobs in IIS
While saving jobs we are getting the following error many times: "A database deadlock occurred during the operation." After this the job does not open, and when we click on that link another error appears: "The invalid CLink operation." It is happening many times. Can any one suggest what can be t...
- Sat Oct 20, 2007 5:55 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Commit row interval for OCI stage
- Replies: 2
- Views: 899
Oddly enough, I have a call open to support for a similar issue - but in my case the settings for time-based and row-based commits are being ignored and the commit isn't done until the job finishes. What platform are you running on? Hi, I am using Oracle 10g on a Unix box. I even tried and changed...
- Thu Oct 18, 2007 2:09 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Commit row interval for OCI stage
- Replies: 2
- Views: 899
Commit row interval for OCI stage
I have more than 30 million records, and when I insert them into the Oracle table via the OCI stage it commits the transactions automatically after 64 records. I checked my env variable settings: $APT_ORAUPSERT_COMMIT_ROW_INTERVAL=5000, so you can see the commit interval is 5000, but stil...
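One caveat worth hedging here: as far as I know, $APT_ORAUPSERT_COMMIT_ROW_INTERVAL governs the Oracle Enterprise stage's upsert operator; the older OCI plugin stage commits according to its own transaction-handling properties, so the variable may simply not apply there (check your version's documentation). Either way, the "Environment variable settings" block written at the top of a parallel job's log shows the value actually in effect at run time. A sketch against a saved log excerpt (the file path and contents below are illustrative, not from a real log):

```shell
# Grep a saved job-log excerpt for the commit-interval setting.
# The sample file mimics the env dump a parallel job writes at startup.
cat > /tmp/job_env_dump.txt <<'EOF'
Environment variable settings:
APT_CONFIG_FILE=/opt/IBM/InformationServer/Server/Configurations/default.apt
APT_ORAUPSERT_COMMIT_ROW_INTERVAL=5000
EOF
grep 'APT_ORAUPSERT_COMMIT_ROW_INTERVAL' /tmp/job_env_dump.txt
# -> APT_ORAUPSERT_COMMIT_ROW_INTERVAL=5000
```

If the variable does not appear in that block at all, the job never received it, which points back at the project or job-level parameter setup rather than the stage.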
- Thu Sep 27, 2007 1:11 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Sequence Performance Query
- Replies: 4
- Views: 1918
You will also need to take into consideration that if you are firing this many jobs at the same time, the CPU might run out of processing power and start to struggle (you need to take the hardware into consideration). Also, logging information into the job log can be done by calling a custom routine that do...
- Wed Sep 26, 2007 4:08 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Sequence Performance Query
- Replies: 4
- Views: 1918
Sequence Performance Query
I wanted to know about sequence performance. I have 100-plus small sequences, each incorporating 4-5 jobs on average. These sub-sequences are to be called from one master sequence which represents a single database. What can be the best approach for moving forward: - 1....
- Tue Sep 18, 2007 4:07 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Reject Handling in DB2 Enterprise Stage?
- Replies: 1
- Views: 745
Reject Handling in DB2 Enterprise Stage?
Hi, I am using the DB2 Enterprise stage in the job to load data into DB2. The job design is very simple, i.e. Source Seq file > Transformer > DB2 stage. I want to have a reject handling mechanism for datatype mismatches in the DB2 stage, i.e. if the target column is integer and we encounter character data in sou...
- Tue Sep 18, 2007 1:55 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Problem with Change Capture Stage
- Replies: 7
- Views: 3285
- Mon Sep 17, 2007 4:52 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: fatal error
- Replies: 8
- Views: 2985
Re: fatal error
The call to sqlldr failed; the return code = 256; please see the loader logfile /opt/product/Ascential/DataStage/OnePercent/Scratch/ora.8497.14949d.2.log for details. Please post the full problem description. There is a possibility that you are loading some record in SQL in which a field has a...
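On the return code: 256 looks like a raw wait()-style status rather than an exit code. In that encoding the child's real exit code sits in the high byte, so sqlldr most likely exited with code 1, i.e. a generic load failure whose specifics will be in the loader logfile. A quick sketch of the decoding:

```shell
# Decode a wait()-style status: the child's exit code is status >> 8.
status=256
exit_code=$(( status >> 8 ))
echo "$exit_code"
# -> 1
```

So "return code = 256" and "sqlldr exited with 1" describe the same event; the interesting diagnostics are in the ora.*.log file the message points at.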
- Mon Sep 17, 2007 4:37 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Problem with Change Capture Stage
- Replies: 7
- Views: 3285
Then can we pass the data with the natural keys and this column (neither a key nor expected to change) to a different stream, and only the required fields to the Change Capture stage? Then we can join/lookup both streams and get the data in the required format for the target database. Just a vague idea; just check ...
- Mon Sep 17, 2007 4:34 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Problem with Change Capture Stage
- Replies: 7
- Views: 3285
You might want to use a Surrogate Key Generator after the Change Capture stage. But what if I do not want this? This field is required to be populated in the target table, but since it is generated by a Surrogate Key Generator, I cannot calculate a change on it; it also cannot be defined as a key, as I...
- Mon Sep 17, 2007 1:38 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Problem with Change Capture Stage
- Replies: 7
- Views: 3285
Probably not, since it's receiving its default value on the output. You want to have your cake and eat it too - to pass the column through without using it as a change column. That's not how Change ... But what if I do not want this? This field is required to be populated in the target table, but s...
- Mon Sep 17, 2007 12:34 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Problem with Change Capture Stage
- Replies: 7
- Views: 3285
Problem with Change Capture Stage
After reading multiple posts in the forum I got many inputs and tried them, but I am still not able to remove this warning from the Change Capture stage: CC_MotorCover: When checking operator: Defaulting "QUOTE_POLICY_LINK_IDENTIFIER" in transfer from "beforeRec" to "outpu...
- Wed Sep 12, 2007 5:10 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Avoiding Defaulting of invalid data in integer in xfr
- Replies: 1
- Views: 890
Avoiding Defaulting of invalid data in integer in xfr
Hi, we have input data as char and target data as integer. We are loading data via a transformer into DB2 using the DB2 Enterprise stage. If the input char data is a valid integer then the transformer does an implicit conversion and loads it into DB2. But if the input data is an invalid integer value then the transformer defaul...
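In a parallel transformer the usual pattern for this is to test validity before the implicit conversion is attempted, e.g. an output-link constraint along the lines of IsValid("int32", in.COL) for the DB2 link and its negation for a reject link (check your version's transformer function reference for the exact form). The split itself can be mimicked in shell with awk on made-up sample data, just to show the shape of the valid/reject routing:

```shell
# Split rows into valid integers vs rejects, mimicking the
# transformer constraint / reject-link pattern with awk.
printf '123\nabc\n45x\n678\n' > /tmp/char_input.txt
awk '/^-?[0-9]+$/ { print > "/tmp/valid.txt"; next }
                  { print > "/tmp/reject.txt" }' /tmp/char_input.txt
cat /tmp/reject.txt
# -> abc
# -> 45x
```

Rows failing the test never reach the conversion, so nothing gets silently defaulted; the reject stream can then be written to a file or table for investigation.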