Search found 357 matches
- Tue Nov 13, 2007 12:33 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Environment Variables
- Replies: 9
- Views: 7945
Hi TPQ, As I mentioned earlier, do not use the job parameter picked from the environment variables list ($Server_Name) in the DB2 API stage. Create a parameter that is unique to the job, say Server_NAME. Create a job sequence and add the environment variable $Server_Name in the j...
- Tue Nov 13, 2007 10:46 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Environment Variables
- Replies: 9
- Views: 7945
Hi, Are you using the DB2 Enterprise stage or the DB2 API stage? Try connecting to DB2 using these two stages. Don't use the environment variable for now; just create a job parameter, say server, and try to use that parameter. I have had the same issue with the Teradata MultiLoad stage. It does not accept paramet...
- Thu Dec 28, 2006 6:55 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: string to Decimal conversion
- Replies: 2
- Views: 2225
- Fri Dec 22, 2006 1:53 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: How to run the next job in the sequence?
- Replies: 3
- Views: 1787
Hi Kumar, Please frame your question properly in order to get a quick and correct reply. What I understand is that if the 3rd job aborted, the next time you run the sequence you want to continue from the 3rd job. Is that so? If that is the case, use the checkpoint option available in the job propert...
- Fri Aug 11, 2006 3:45 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Getting record count from Transformer
- Replies: 6
- Views: 7778
- Tue Aug 08, 2006 8:38 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Linux parallel config problem
- Replies: 3
- Views: 2186
Hi John, We are facing the same issue that you have mentioned. We have a Teradata API stage which works fine, but if it is pulled into a shared container, the osh script generates DSCAPIOP_ before all the parameter names and causes a connection failure. We are running DataStage 7.5.2 on SUSE Linux. Can ...
- Fri Aug 04, 2006 10:37 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Combinability mode. What is it used for?
- Replies: 4
- Views: 1560
Hi, By default the stage operators can be combined. If you want to override this for a specific stage, you can set the combinability mode for that stage to Don't Combine. For fault analysis, set the environment variable APT_DISABLE_COMBINATION to True and run the job. The job log will show exactly...
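The fault-analysis step above can be sketched as a shell fragment. APT_DISABLE_COMBINATION is the variable named in the post; the dsjob invocation, project, and job names below are placeholders, not part of the original:

```shell
# Disable operator combination so the job log reports each operator
# separately instead of combined operators.
export APT_DISABLE_COMBINATION=True
echo "APT_DISABLE_COMBINATION=$APT_DISABLE_COMBINATION"

# Hypothetical run command -- project and job names are placeholders:
# dsjob -run -jobstatus MyProject MyParallelJob
```

With combination disabled, the score in the log shows one process per operator, which makes it easier to see which stage is at fault.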
- Wed Jul 12, 2006 7:30 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Writing correct English in this forum when asking questions!
- Replies: 21
- Views: 6533
Aoccdrnig to a rscheearch at an Elingsh uinervtisy, it deosn't mttaer in waht oredr the ltteers in a wrod are, the olny iprmoetnt tihng is taht frist and lsat ltteer is at the rghit pclae. The rset can be a toatl mses and you can sitll raed it wouthit porbelm. Tihs is bcuseae we do not raed ervey lt...
- Tue Jul 11, 2006 3:51 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Unable to access the existing job
- Replies: 2
- Views: 2045
Hi, Run list_readu: $DSHOME/bin/list_readu This will show the list of open jobs and who is working on them. If you want to release the lock on the job, do this: 1) Log in to the DataStage server box from the command line as the DataStage administrator 2) Go to the home directory, i.e. the DSEngine directory 3) Ente...
- Tue Jul 11, 2006 3:38 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Select access to sys.dba_extents
- Replies: 9
- Views: 3642
- Fri Jun 30, 2006 9:08 am
- Forum: Site/Forum
- Topic: Video Tech Tip Series - FEEDBACK
- Replies: 49
- Views: 153647
- Fri Jun 02, 2006 2:45 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Updating records in SCD type 2
- Replies: 6
- Views: 2237
Hi Seanc, From the CDC stage, take the output to the transformer. In the transformer, have 2 constraints: 1. change_code=1 or change_code=3 (new inserts) 2. change_code=3 (old record updates) The output from constraint 2 should go to a Join stage where you join with the old records. Now you have 2 stream...
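The two constraints above can be sketched in Python as a stand-in for the transformer logic (this is not DataStage syntax; the record and field names are assumptions for illustration):

```python
def route_cdc(records):
    """Split CDC output into an insert stream and an update stream.

    Per the post above: change_code 1 = new row, change_code 3 = changed
    row. A changed row feeds both streams -- its new version is inserted,
    while the update stream joins back to the old records to expire them.
    """
    inserts = [r for r in records if r["change_code"] in (1, 3)]
    updates = [r for r in records if r["change_code"] == 3]
    return inserts, updates
```

A record with change_code 2 (an unchanged or deleted code, depending on the CDC setup) falls through both constraints and is dropped, matching the behaviour of transformer constraints that match nothing.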
- Fri Jun 02, 2006 2:21 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: FATAL ERROR
- Replies: 5
- Views: 5157
Hi, Code 132 means that the dataset is not available. Code 1 means the file path is incorrect. Code 134?? Are you using $APT_CONFIG_FILE? The data type should be FilePath and it should point to the right configuration file. But I am not sure whether I got Code 138 or Code 134. Kumar, you don't get any...
- Thu Apr 20, 2006 3:12 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Type Conversion Using Modify Stage
- Replies: 4
- Views: 3508
- Thu Apr 13, 2006 2:40 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: How to join Header and Trailer records
- Replies: 10
- Views: 3613