Hi All,
I need to run a job from the command line and pass an environment variable into the job each time, using dsjob with the -param option.
Can anyone suggest the correct syntax for this?
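Not an answer from this thread, but a sketch of the usual shape of the command; MyProject, MyJob, and $MY_ENV_VAR below are placeholders, and the environment variable must already be defined as a job parameter for -param to accept it:

```shell
# Run the job, passing a value for an environment-variable job parameter.
# Environment-variable parameters keep their leading $ in the -param name,
# so the name=value pair is quoted to stop the shell from expanding it.
dsjob -run -mode NORMAL \
      -param '$MY_ENV_VAR=/some/value' \
      -jobstatus MyProject MyJob
```

The -jobstatus option makes dsjob wait for the job to finish and return its exit status, which is useful when calling it from a script.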
Search found 131 matches
- Fri Mar 12, 2010 2:06 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Passing environment variable through dsjob
- Replies: 2
- Views: 1773
- Fri Mar 05, 2010 11:38 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: warning-node_db2node3: Warning: unable to chdir
- Replies: 8
- Views: 4201
Re: warning-node_db2node3: Warning: unable to chdir
I want to add to the previous note that /data/ds/Projects/dpr_aos_dev exists, and the dpr_aos_dev folder has the following privileges:- cbidev01 /data/ds/Projects $>ls -ltr total 39392 drwxrwsr-x 51 dsadm dstage 4096 Mar 04 23:20 dpr_aos_dev drwxrwsr-x 2972 dsadm dstage 192512 Mar 05 13:22 dpr_iccm_de...
- Fri Mar 05, 2010 11:36 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: warning-node_db2node3: Warning: unable to chdir
- Replies: 8
- Views: 4201
Re: warning-node_db2node3: Warning: unable to chdir
I want to add to the previous note that /data/ds/Projects/dpr_aos_dev exists, and the dpr_aos_dev folder has the following privileges:- cbidev01 /data/ds/Projects $>ls -ltr total 39392 drwxrwsr-x 51 dsadm dstage 4096 Mar 04 23:20 dpr_aos_dev drwxrwsr-x 2972 dsadm dstage 192512 Mar 05 13:22 dpr_iccm_de...
- Fri Mar 05, 2010 4:53 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: warning-node_db2node3: Warning: unable to chdir
- Replies: 8
- Views: 4201
The second log entry in the Director shows which file is actually being used at runtime; please check there after a run to be 100% certain that this is the actual configuration file in use. Furthermore, does that directory exist on both nodes? Actual config file used is:- main_program: APT ...
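For orientation, this is the general shape of a two-node APT configuration file such as the EBI_TwoNode.apt mentioned in this thread; the node names, hostnames, and paths below are placeholders, not values from the thread:

```
{
  node "node1"
  {
    fastname "server1"
    pools ""
    resource disk "/data/ds/work" {pools ""}
    resource scratchdisk "/data/ds/scratch" {pools ""}
  }
  node "node2"
  {
    fastname "server2"
    pools ""
    resource disk "/data/ds/work" {pools ""}
    resource scratchdisk "/data/ds/scratch" {pools ""}
  }
}
```

Every resource disk and scratchdisk path must exist and be accessible on the node it is declared for, which is why a missing directory on one node produces the "unable to chdir" warning.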
- Fri Mar 05, 2010 2:47 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: warning-node_db2node3: Warning: unable to chdir
- Replies: 8
- Views: 4201
Are you certain that the config file you posted is what is actually being used (check the second entry in the Director log for the run to confirm), since the dpr_aos_dev directory is not contained in the config file EBI_TwoNode.apt? What stage is node_db2node3 that is triggering the warning? Thanks fo...
- Fri Mar 05, 2010 1:57 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: warning-node_db2node3: Warning: unable to chdir
- Replies: 8
- Views: 4201
Re: warning-node_db2node3: Warning: unable to chdir
I want to add to the previous note that /data/ds/Projects/dpr_aos_dev exists, and the dpr_aos_dev folder has the following privileges:- cbidev01 /data/ds/Projects $>ls -ltr total 39392 drwxrwsr-x 51 dsadm dstage 4096 Mar 04 23:20 dpr_aos_dev drwxrwsr-x 2972 dsadm dstage 192512 Mar 05 13:22 dpr_iccm_dev...
- Fri Mar 05, 2010 1:51 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: warning-node_db2node3: Warning: unable to chdir
- Replies: 8
- Views: 4201
warning-node_db2node3: Warning: unable to chdir
Hi All, I have created a new project on the same DataStage server, and I am able to successfully connect to a DB2 database which is on another server, using the same config file that is used in another running project. Now when I run the job, I get one warning message which does not seem to create any problem now bu...
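Not from the thread, but a quick way to check the usual cause of this warning, namely that a directory named in the config file is missing or unreadable on one of the nodes, is to test it from the conductor node; the hostnames and path below are placeholders:

```shell
# Check that the directory exists and is accessible on every node
# listed in the APT config file (hostnames here are placeholders).
for node in server1 server2; do
    ssh "$node" "cd /data/ds/Projects/dpr_aos_dev" \
        && echo "$node: OK" \
        || echo "$node: cannot chdir"
done
```

Any node that reports "cannot chdir" either lacks the directory or denies access to the user running the parallel engine.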
- Mon Mar 01, 2010 11:16 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Transformer Null handling and reject link
- Replies: 8
- Views: 3774
- Tue Feb 16, 2010 4:23 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Hashing keys and grouping columns
- Replies: 2
- Views: 1243
- Tue Feb 16, 2010 3:59 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Hashing keys and grouping columns
- Replies: 2
- Views: 1243
Hashing keys and grouping columns
Hi All, I have one confusion in a job: in some of the jobs, prior to the Aggregator stage, the data is hash partitioned on, let's say, columns A, B & C, and in the Aggregator, grouping is done on A, B, C, D (where D is not constant). Will the result be correct, and what will be the impact on performance?? th...
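The result will still be correct: any two rows that agree on all four grouping columns A, B, C, D necessarily agree on A, B, C as well, so they hash to the same partition and each group is aggregated in exactly one place. A small Python sketch (illustrative only, not DataStage code) of why partitioning on a subset of the grouping keys is safe:

```python
from collections import defaultdict

# Rows of (A, B, C, D, value); we hash-partition on (A, B, C) only.
rows = [("a1", "b1", "c1", "d1", 10),
        ("a1", "b1", "c1", "d2", 20),
        ("a1", "b1", "c1", "d1", 5),
        ("a2", "b1", "c1", "d1", 7)]

NUM_PARTITIONS = 2

# Partition on the subset (A, B, C) of the grouping keys.
partitions = defaultdict(list)
for row in rows:
    partitions[hash(row[:3]) % NUM_PARTITIONS].append(row)

# Aggregate (sum of value) on the full key (A, B, C, D), one partition
# at a time, as each parallel node would.
result = {}
for part in partitions.values():
    for a, b, c, d, v in part:
        result[(a, b, c, d)] = result.get((a, b, c, d), 0) + v

# Rows with equal (A, B, C, D) also share (A, B, C), so they land in the
# same partition and each group is summed exactly once.
print(result[("a1", "b1", "c1", "d1")])  # 15
```

The performance caveat is the converse case: if the subset keys have few distinct values relative to the full grouping key, data may skew onto a few partitions.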
- Thu Dec 03, 2009 2:08 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Job Sequencer
- Replies: 14
- Views: 5878
- Fri Nov 13, 2009 8:20 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: (Subroutine failed to complete successfully (30107))
- Replies: 2
- Views: 3659
(Subroutine failed to complete successfully (30107))
Hi Ray, With due respect, I need your urgent suggestion. We are not able to connect to DataStage and are getting the below error. First time got this error: "Error calling subroutine: *DataStage*DSR_SELECT (Action=4); check DataStage is set up correctly in project NiharTest (Subroutine failed to complete ...
- Tue Nov 10, 2009 3:55 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Unable to initialize plug-in
- Replies: 33
- Views: 26779
- Mon Sep 28, 2009 11:17 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: working of join stage!
- Replies: 16
- Views: 6335
The Join stage will first sort the input streams according to your join key(s), unless that has already been done or has been explicitly turned off. That way the join mechanism can process the data without having to store large amounts of interim data in memory or on disk. Each node is completely distinct a...
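The sort-merge behaviour described above can be sketched in Python (illustrative only, not DataStage code): once both inputs are sorted on the join key, matches are emitted in a single streaming pass, so only the current key group needs to be revisited rather than a whole input held in memory:

```python
def sort_merge_join(left, right, key=lambda r: r[0]):
    """Inner join of two row lists on a key, via sort-merge.

    Both inputs are sorted first (as the Join stage does unless the data
    is already sorted), then merged in one streaming pass.
    """
    left = sorted(left, key=key)
    right = sorted(right, key=key)
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        kl, kr = key(left[i]), key(right[j])
        if kl < kr:
            i += 1          # left key too small: advance left
        elif kl > kr:
            j += 1          # right key too small: advance right
        else:
            # Keys match: emit this left row against the whole matching
            # right-hand key group, then rewind for the next left row.
            j0 = j
            while j < len(right) and key(right[j]) == kl:
                out.append(left[i] + right[j][1:])
                j += 1
            i += 1
            j = j0
    return out

pairs = sort_merge_join([(1, "a"), (2, "b")], [(2, "x"), (2, "y"), (3, "z")])
print(pairs)  # [(2, 'b', 'x'), (2, 'b', 'y')]
```

Because each parallel node joins only its own partition, the inputs must be partitioned identically on the join key for this per-node merge to see all matching rows.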