Hi,
I am trying to see if it's possible to load data dynamically from a sequential file to Salesforce with RCP enabled.
Sequential file (with schema file) ---> Salesforce plug-in.
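With RCP enabled, the Sequential File stage takes its record layout from a schema file rather than from design-time column definitions. A minimal sketch of such a schema file is below; the field names, lengths, and delimiter properties are illustrative only, not taken from the original post:

```
// Hypothetical schema file for a comma-delimited source feeding Salesforce
record
  {final_delim=end, delim=',', quote=double}
(
  Id: string[max=18];
  Name: nullable string[max=80];
  AnnualRevenue: nullable decimal[18,2];
)
```

The downstream Salesforce stage then receives whatever columns the schema file declares, so the schema file (not the job design) becomes the place to keep field names in sync with the target object.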
Search found 198 matches
- Mon Jun 01, 2015 6:11 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Would it be possible to load data dynamically into salesforc
- Replies: 6
- Views: 2878
- Fri May 29, 2015 3:00 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Exception while invoking a webservice
- Replies: 2
- Views: 2507
I have tested the invocation from SOAPUI and it works fine using the XML that is being generated in the Peek stage. I am assuming the issue could be one of the following - there may be some special way to handle SOAP headers (I have tried one in option 3, but it still doesn't work) - internally within the webser...
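For reference, SOAP header entries live in a separate `Header` element alongside the `Body`, and a request that works in SOAPUI usually carries them there. The sketch below shows the SOAP 1.1 envelope structure only; the `svc` namespace, element names, and session token are placeholders, not the poster's actual service:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:svc="http://example.com/service">
  <soapenv:Header>
    <!-- Header entries (e.g. auth tokens) go here, not in the Body -->
    <svc:SessionHeader>
      <svc:sessionId>PLACEHOLDER</svc:sessionId>
    </svc:SessionHeader>
  </soapenv:Header>
  <soapenv:Body>
    <svc:someOperation>
      <svc:arg>value</svc:arg>
    </svc:someOperation>
  </soapenv:Body>
</soapenv:Envelope>
```

Comparing the envelope the job actually sends (e.g. via the Peek stage) against the working SOAPUI request at this structural level is often the quickest way to spot a header that ended up in the wrong place.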
- Tue Nov 15, 2011 12:34 am
- Forum: General
- Topic: Access Duration
- Replies: 11
- Views: 5136
- Mon Nov 14, 2011 9:06 pm
- Forum: General
- Topic: Access Duration
- Replies: 11
- Views: 5136
- Wed Nov 09, 2011 3:29 am
- Forum: General
- Topic: Access Duration
- Replies: 11
- Views: 5136
- Tue Nov 08, 2011 10:26 pm
- Forum: General
- Topic: Access Duration
- Replies: 11
- Views: 5136
Hi, at our corporate office we maintain a DataStage server for employees to practice on. Different employees request access, but not all of them use it for practice, and our list of employees using the DataStage server is growing day by day. Management wanted to monitor the frequency ...
- Tue Nov 01, 2011 12:01 pm
- Forum: General
- Topic: Access Duration
- Replies: 11
- Views: 5136
Access Duration
Hi All,
I need to create a shell script to generate a report on Access Duration of All Users.
Can anyone share their thoughts on this?
Thanks
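One common starting point for this kind of report is the `last` command, which reads login records from wtmp. The sketch below sums session durations per user; it assumes the classic `last` output format where each closed session line ends with a `(HH:MM)` or `(D+HH:MM)` duration field. Field positions vary between platforms, so treat this as a template rather than a finished script:

```shell
#!/bin/sh
# Summarise total login duration per user from `last`-style output, e.g.:
#   alice  pts/0  10.0.0.5  Mon Nov  7 09:00 - 10:30  (01:30)
# Sessions still open (no trailing "(...)" field) are skipped.

report_access_duration() {
    awk '
        # Only closed sessions end with a "(H:MM)" or "(D+HH:MM)" duration.
        /\([0-9+:]+\)$/ {
            dur = $NF
            gsub(/[()]/, "", dur)          # strip parentheses
            days = 0
            if (dur ~ /\+/) {              # "D+HH:MM" form
                split(dur, d, "+")
                days = d[1]; dur = d[2]
            }
            split(dur, t, ":")
            mins[$1] += days * 1440 + t[1] * 60 + t[2]
        }
        END {
            for (u in mins)
                printf "%s %d minutes\n", u, mins[u]
        }
    ' | sort
}

# Typical use on the server itself (reading wtmp may need root):
#   last -w | report_access_duration
```

On some systems `last -F` prints full timestamps, which makes per-day reporting easier; the parsing above would need adjusting for that format.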
- Sat Jan 29, 2011 1:55 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: tpump - read/write to table in same job
- Replies: 0
- Views: 1513
tpump - read/write to table in same job
Hi, browsing through a project I came across jobs that perform SCD on dimension tables. These jobs use tpump to read/insert/update the table, and all of them are working fine. I had thought designing jobs in such a scenario would result in deadlock. Won't a deadlock occur with row-level locks?...
- Tue Aug 03, 2010 4:51 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: full outer join (returning defaults)
- Replies: 15
- Views: 9104
Hi, I have looked at the schemas. While unloading from Teradata, the schema is defined as not nullable for a few columns, even though I specified them as nullable in the metadata. This gets carried forward through most of the stages until a transformer is used. Is there a better way to change it without usi...
- Fri Jul 30, 2010 6:47 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: full outer join (returning defaults)
- Replies: 15
- Views: 9104
Hi Ray, I have enabled OSH_PRINT_SCHEMA and looked in the logs, but the log doesn't give schema information for each link. It just gives info about one of the inputs (a dataset) and not the other inputs (Teradata Enterprise). I am looking at the OSH script in the Director log. Please l...
- Wed Jul 28, 2010 8:25 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: full outer join (returning defaults)
- Replies: 15
- Views: 9104
Try setting the environment variable that prints the record schema used by each operator into the log. From memory it's OSH_PRINT_SCHEMAS (but please do check). ... Hi Ray, I tried to enable OSH_PRINT_SCHEMAS by setting it to TRUE, but I can't find it in the Director log. I have gone through the logs, but was...
- Wed Jul 28, 2010 8:24 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: full outer join (returning defaults)
- Replies: 15
- Views: 9104
- Mon Jul 26, 2010 5:26 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: full outer join (returning defaults)
- Replies: 15
- Views: 9104
- Sun Jul 25, 2010 8:29 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: full outer join (returning defaults)
- Replies: 15
- Views: 9104
I have checked whether the record schema has default values, but it doesn't have any (nor any provision for providing them; I right-clicked the column definition and selected Edit Row). For the key fields I have used the same names. Full outer join is creating two fields (l...
- Sun Jul 25, 2010 8:13 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: full outer join (returning defaults)
- Replies: 15
- Views: 9104
full outer join (returning defaults)
Hi, I am using a full outer join in my job. All its inputs, from the start of the stream (this is the first job in the stream), have been kept as nullable (Yes). When I try to join on key fields, say KeyField_1 and KeyField_2, I get default values instead of nulls when no match is found. I managed to...
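A likely explanation for the behaviour described above: a field declared not-nullable somewhere upstream cannot carry a null, so when the join finds no match the engine substitutes the type's default value (empty string, zero) instead. In schema-file terms the distinction looks like this; the field names are hypothetical, for illustration only:

```
record (
  Amount:  decimal[10,2];          // not nullable: unmatched rows get 0 (the default)
  Amount2: nullable decimal[10,2]; // nullable: unmatched rows carry NULL
)
```

If the stated intent holds (all inputs nullable from the start), checking the runtime schema on each link going into the join, rather than the design-time metadata, would confirm whether nullability actually survived that far.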