Search found 27 matches

by rkacham_DSX
Tue Oct 14, 2014 7:58 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Amazon redshift
Replies: 1
Views: 2909

Amazon redshift

We have Information Server 9.1. Is there any way we can connect to and extract data from Amazon Redshift from 9.1? I have reached out to IBM and they said it is supported in 11.3. We have an immediate requirement.
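What I was hoping to try as a workaround is the generic ODBC connector pointed at a Redshift (PostgreSQL-compatible) ODBC driver, with a .odbc.ini DSN roughly like the sketch below. The driver path, endpoint and database name are placeholders, not real values, and the exact keyword names depend on the driver:

    [REDSHIFT_SRC]
    Driver=<path to a Redshift or PostgreSQL ODBC driver library>
    Host=<cluster endpoint>
    Port=5439
    Database=<database name>

I just don't know whether the 9.1 ODBC connector is certified or even workable against that, which is really my question.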

Thanks in advance
by rkacham_DSX
Thu Feb 23, 2012 3:08 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: SAP BW plugin
Replies: 2
Views: 1744

SAP BW plugin

Hi, we have been using the SAP BAPI plugin and just found that we have licences for the SAP BW plugin, not the SAP BAPI. Can we call a BAPI from the SAP BW stage, or is there any workaround to extract SAP R/3 table data using the SAP BW stage?

Thanks in advance
Ramesh
by rkacham_DSX
Fri Aug 21, 2009 12:52 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Restarting an aborted job sequence
Replies: 4
Views: 2903

Re: Restarting an aborted job sequence

Hi,
We are having a similar issue. Any updates on this? Did this issue get resolved?
by rkacham_DSX
Fri Aug 21, 2009 12:49 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Sequencer not working properly
Replies: 5
Views: 3031

Re: Sequencer not working properly

Hi,
We are also migrating to DS8 and are having a similar issue.
Whenever we restart an aborted job from the aborted sequence, the job finishes with a 'see log' status and the sequence finishes with warnings, which is causing an issue at the master sequence level.
Is there any setting we need to change?
by rkacham_DSX
Wed Jun 03, 2009 12:09 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Parallel job reports failure ERROR
Replies: 4
Views: 2837

Parallel job reports failure ERROR

Hi,

I am getting this error when we run a larger number of jobs in parallel:
Parallel job reports failure (code -99)
I am just wondering whether I need to change any process parameters in Linux; we are on Red Hat Linux.
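The kind of change I had in mind is raising the per-user process and open-file limits for the DataStage user in /etc/security/limits.conf, something like the following (the user name dsadm and the numbers are only my guesses, not recommendations):

    dsadm  soft  nproc   8192
    dsadm  hard  nproc   16384
    dsadm  soft  nofile  8192
    dsadm  hard  nofile  16384

Is that the right direction, or is some other kernel parameter involved here?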

Datastage 8.1
by rkacham_DSX
Tue Mar 11, 2008 12:05 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: sap r/3 connection issue from datastage
Replies: 2
Views: 2400

Thanks for the reply. Can you advise which special characters DataStage accepts for SAP?
by rkacham_DSX
Thu Mar 06, 2008 3:31 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: sap r/3 connection issue from datastage
Replies: 2
Views: 2400

sap r/3 connection issue from datastage

Hi, the SAP system admin changed the SAP system password. It was connecting fine before they changed the password. When they changed it, I went and changed it in the DataStage SAP administrator and in the jobs, but I am getting RFC Error: Name or password is incorrect (repeat logon). The strange thing is I ...
by rkacham_DSX
Thu Feb 21, 2008 4:19 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Datastage 8 new features
Replies: 2
Views: 2835

Datastage 8 new features

Hi,
What are the new features in DataStage 8.0? Is there any document
that lists the new features of DataStage 8?

Thanks for your help in advance.
by rkacham_DSX
Sat Mar 03, 2007 12:37 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: ODBC: Unrecognized argument error
Replies: 1
Views: 1969

ODBC: Unrecognized argument error

Hi, we are getting the following error randomly in a PX job: SrcCommonPropertyODBC: Unrecognized argument: SELECT DISTINCT CP.Common_Property_ID, CP.Originating_Network_ID, rtrim(ltrim(Pet.Title)) Title FROM common_property CP inner join Property P on CP.Common_Property_ID=P.Common_Property_ID and CP.Origi...
by rkacham_DSX
Sat Mar 03, 2007 9:25 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Extending memory from 2GB to 3 or 4GB
Replies: 0
Views: 1030

Extending memory from 2GB to 3 or 4GB

Hi, we have PX installed on Solaris and we have 16GB of memory. As DataStage on Solaris is 32-bit, it can only use 4GB, and I think each project can use only 2GB of memory. We have a situation where one of our projects needs more than 2GB. I was going through the manual and it says that for HP, to extend project-level memory, get...
by rkacham_DSX
Thu Mar 01, 2007 9:22 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Range lookups in PX
Replies: 3
Views: 1919

Range lookups in PX

Hi,

As we can do range lookups in a server job using the ODBC/Oracle stage, can we do range lookups in PX? How is the performance?

select col1 where col2 >= ? and col3 <= ?  /* this is in the server job */
Is there any suggestion for doing range lookups in DataStage PX?
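What I had in mind for PX is either a sparse lookup against a database stage with user-defined SQL, roughly like the sketch below (table and column names are placeholders, and depending on the stage the keys may have to be referenced as ORCHESTRATE.<column> instead of parameter markers):

    SELECT ref.col1
    FROM   reference_table ref
    WHERE  ref.range_start <= ?
      AND  ref.range_end   >= ?

or a normal Join on a coarser key followed by a Transformer/Filter that applies the range condition. I am not sure which of the two performs better on larger reference volumes.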

thanks for the help
by rkacham_DSX
Wed Dec 27, 2006 1:47 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Fatal error Default node pool empty
Replies: 2
Views: 2066

Fatal error Default node pool empty

Hi, I am getting the following error: main_program: Fatal Error: The set of available nodes for op5 (parallel CommonEpisodeStg) is empty. This set is influenced by calls to addNodeConstraint(), addResourceConstraint() and setAvailableNodes(). If none of these functions have been called on this operator...
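In case it matters, our configuration file ($APT_CONFIG_FILE) looks more or less like the sketch below (host name and paths are placeholders). As far as I understand, pools "" should put the node in the default pool, so I am not clear why op5 ends up with an empty node set:

    {
      node "node1"
      {
        fastname "etlhost"
        pools ""
        resource disk "/data/ds/datasets" { pools "" }
        resource scratchdisk "/data/ds/scratch" { pools "" }
      }
    }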
by rkacham_DSX
Tue Jul 25, 2006 10:46 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Accessing Universe/hashfiles using odbc stage in PX jobs
Replies: 4
Views: 2480

In the IBM DataStage Advanced Developer class, they said the only way to access hashed files in parallel jobs is using the ODBC stage... Is there any other way to access hashed files in parallel jobs? We have a project in server jobs, and further development on this project is being done in parallel jobs, so we ...
by rkacham_DSX
Tue Jul 25, 2006 9:53 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Accessing Universe/hashfiles using odbc stage in PX jobs
Replies: 4
Views: 2480

Accessing Universe/hashfiles using odbc stage in PX jobs

Hi, I am trying to access UniVerse/hashed files using the ODBC stage in parallel jobs. I am able to connect to UniVerse using the ODBC stage from my server jobs, but when I try this in parallel jobs I'm not able to connect to UniVerse using ODBC. I was wondering, do we need to change any entries in the .odbc.ini files...
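For reference, the server-side DSN we use looks roughly like this (driver path, host and account are placeholders, and the keyword names may differ depending on the UniVerse ODBC driver version shipped with DataStage):

    [UVSOURCE]
    Driver=<path to the UniVerse ODBC driver library>
    Description=UniVerse account holding the hashed files
    HostName=<server host>
    Database=<UniVerse account name>

Do the parallel engine nodes need their own copy of this entry, or a different driver setup altogether?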