Search found 42 matches

by Satwika
Mon Aug 05, 2013 8:49 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to get the design of a job from Job Executable
Replies: 5
Views: 3486

Thanks for your responses.

I have only the executable. I am even ready to design the new job, but the problem is that I do not have the logic for the stages (like Transformer/Aggregator) used in that DSX.
by Satwika
Mon Aug 05, 2013 6:13 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to get the design of a job from Job Executable
Replies: 5
Views: 3486

How to get the design of a job from Job Executable

Hi Everyone

Good Morning !!

I have a job executable DSX. Can anyone please suggest how to recreate the job design from the job executable? The job details are below:

Type of job: Parallel
Job executable version: 7.5
Job has to be designed in version: 8.7

Thanks in advance.
by Satwika
Mon Jan 21, 2013 1:03 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Row/sec keep decreasing when writing data into hash file
Replies: 9
Views: 4831

It's created with type 30 (Dynamic). In 8.5 I am facing this issue with rows/sec.
by Satwika
Fri Dec 28, 2012 12:36 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Row/sec keep decreasing when writing data into hash file
Replies: 9
Views: 4831

Ray, do you mean that the size of the hash file at installation in 8.5 is identical to 7.5? In the hash file stage, under the Create File options button, all values are identical in the 7.5 and 8.5 jobs. In 8.5 the job usually even gets aborted after a long time with a warning like the one mentioned below. CopyOfGSAP_CONTROL_ETL_change...
by Satwika
Fri Dec 28, 2012 12:11 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Row/sec keep decreasing when writing data into hash file
Replies: 9
Views: 4831

Hi, thanks for providing the path. This is a migration project. The same job in DS 7.5 is able to load data into the hash file in around 4 to 5 minutes, with rows/sec staying constant between 15000 and 20000. But in DS 8.5 it keeps decreasing and drops to a few hundred records per sec. We have around 6 million records...
by Satwika
Thu Dec 27, 2012 11:01 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Row/sec keep decreasing when writing data into hash file
Replies: 9
Views: 4831

Would you please help me to find the Hashed File Calculator? In the hash file, 'Minimum Modulus' is defined as 1 by default.
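
For a very rough idea of what the calculator would suggest, assuming the Type 30 defaults of 2048-byte groups and an 80 percent split load, and (purely as an illustration) the roughly 6 million rows mentioned elsewhere in this thread at an assumed 100 bytes each:

minimum modulus ≈ (6,000,000 x 100) / (2048 x 0.8) ≈ 366,000

i.e. a pre-sized modulus in the hundreds of thousands rather than the default of 1, which otherwise forces the file to keep splitting groups as it fills.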
by Satwika
Fri Dec 21, 2012 5:06 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Row/sec keep decreasing when writing data into hash file
Replies: 9
Views: 4831

Row/sec keep decreasing when writing data into hash file

Hi, I am reading data from a database (SQL Server 2005) using the OLDB stage and writing it into a hash file. The data flow starts at 25000 rows/sec and keeps decreasing to 600 rows/sec within a few minutes. In input I have around 5 million records. Can you please help me with how to maintain the rows/sec so that data will lo...
by Satwika
Fri Dec 21, 2012 2:56 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Unable to view data from OLDB stage for a user defined query
Replies: 3
Views: 7774

Unable to view data from OLDB stage for a user defined query

Hi, I am using the ODBC stage as the input stage, with the query mentioned below as a user-defined query. DECLARE @AGGR_DT AS DATETIME SELECT @AGGR_DT=CONVERT(DATETIME,LTrim(RTrim(JOB_PARM_VAL))) FROM TEMP_JOB_CONFIG WHERE JOB_PARM_NM='AGGR_DT' SELECT top 10 EXTRNL_SCTY_ID,STCK_EXCH_ID,SCTY_ID_TYP,LTrim(RTr...
by Satwika
Thu Dec 13, 2012 4:30 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: unique alphanumeric keys with Datastage
Replies: 4
Views: 2555

Can you elaborate on the above solution?
by Satwika
Mon Dec 03, 2012 8:16 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Decimal value (00000000000000000000.000000) to blank it
Replies: 6
Views: 3759

Re: Decimal value (00000000000000000000.000000) to blank it

Whatever, your suggestion would work out for conditions 2 & 3, but what about condition 1? It works even for the 1st condition: if you multiply by 6 zeros, it removes the decimals and makes it a complete integer value. The 1st one's output is 1000, which is greater than 0, so it is not required to replace with...
by Satwika
Mon Dec 03, 2012 4:51 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Decimal value (00000000000000000000.000000) to blank it
Replies: 6
Views: 3759

Re: Decimal value (00000000000000000000.000000) to blank it

Multiply by 1,000,000 (six zeros), convert from decimal to integer, and check whether the value is zero. If yes, populate "", else the value.
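
A rough sketch of that derivation in the Transformer, assuming a hypothetical input column DSLink.AMT of type Decimal and a string output column (the link and column names are placeholders, and it assumes Floor() and DecimalToString() are available in your Transformer version):

If Floor(DSLink.AMT * 1000000) = 0 Then "" Else DecimalToString(DSLink.AMT)

(The output column needs to be a string type; a Decimal output cannot hold an empty string.)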
by Satwika
Mon Oct 01, 2012 4:12 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Number of readers per node --
Replies: 22
Views: 10977

Hi Andrw

I tried with a single node, but the problem still exists... any other suggestions, please?
by Satwika
Fri Sep 28, 2012 8:50 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Number of readers per node --
Replies: 22
Views: 10977

Run your job in a 1-node configuration and see if the error remains. If it is still there with 1-node then your partitioning is not at the root of the problem. Hi Andrw, I don't have access to change it to a single node; I can create the configuration file but am not able to use it in the job. Can I have any sug...
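
In case it helps, a minimal single-node configuration file sketch (the hostname and paths below are placeholders, not from the original post); a job can be pointed at it by adding the $APT_CONFIG_FILE environment variable as a job parameter instead of changing the project default:

{
    node "node1"
    {
        fastname "your_server_name"
        pools ""
        resource disk "/path/to/datasets" {pools ""}
        resource scratchdisk "/path/to/scratch" {pools ""}
    }
}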
by Satwika
Fri Sep 28, 2012 12:34 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Number of readers per node --
Replies: 22
Views: 10977

Thanks Ray, I understood, but my basic problem has not been solved.
Has anyone faced this type of issue? :oops:
Please refer to my post.
by Satwika
Thu Sep 27, 2012 3:53 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Number of readers per node --
Replies: 22
Views: 10977

Does anyone know about this issue? :?: