Search found 202 matches

by pavankvk
Thu Jan 05, 2006 7:59 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: DB2 Fix Pack 10 problems with DataStage
Replies: 1
Views: 741

Not sure what the fix pack did, but we took it off and we are back to normal.

I guess DataStage is having a tough time getting partition information with the new Fix Pack 10 (we use the DB2 partition option in the DB2 stage).

Any pointers appreciated.
by pavankvk
Thu Jan 05, 2006 6:10 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: DB2 Fix Pack 10 problems with DataStage
Replies: 1
Views: 741

DB2 Fix Pack 10 problems with DataStage

Hi,

We recently upgraded our DB2 with Fix Pack 10. We are having performance problems with regular loads. The jobs had been running just fine in production for over a year, and we never had any issues with earlier fix packs.

DataStage 7.01 R1
AIX 5
DB2 UDB V8

Is anyone in a similar situation?
by pavankvk
Thu Dec 29, 2005 4:45 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Transpose a row Question in Parallel jobs
Replies: 18
Views: 9699

What you're trying to achieve is called a vertical pivot - you can search the Forum for techniques. For a small, finite number of rows the stage variables approach is indicated, but you need to ensure that all values for each pivot key occur on the same processing node. Do this by partitioning and ...
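Outside DataStage, the same vertical-pivot logic can be sketched in a few lines of Python (this is an illustration of the technique, not DataStage syntax; it assumes rows arrive grouped/sorted by key, which mirrors the requirement that all values for a pivot key land on the same node in order):

```python
# Illustrative sketch of the vertical pivot described above: rows must arrive
# grouped/sorted by key, just as the stage-variables approach requires.
from itertools import groupby

rows = [("A", "P1"), ("A", "P2"), ("B", "P1")]  # (key, value) pairs, sorted by key

def vertical_pivot(rows):
    """Collapse consecutive rows sharing a key into one row: key, v1, v2, ..."""
    for key, group in groupby(rows, key=lambda r: r[0]):
        yield (key,) + tuple(value for _, value in group)

for out in vertical_pivot(rows):
    print(",".join(out))  # A,P1,P2  then  B,P1
```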
by pavankvk
Thu Dec 29, 2005 2:07 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Transpose a row Question in Parallel jobs
Replies: 18
Views: 9699

Transpose a row Question in Parallel jobs

Hi,

I have a requirement which is as below.

I have 2 rows:
A,P1
A,P2

A is the key. I want an output row as:

A,P1,P2

I did this once using awk, but it's too slow for huge volumes of data, so I want this done in DataStage. Is it possible?

TIA,
pavan
by pavankvk
Thu Dec 08, 2005 3:26 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: APT_BadAlloc: Heap allocation failed.
Replies: 8
Views: 5258

I get the following messages in the log: APT_CombinedOperatorController(1),0: The current soft limit on the data segment (heap) size (2147483645) is less than the hard limit (2147483647), consider increasing the heap size limit. APT_CombinedOperatorController(1),0: Current heap size: 2,141,571,120 b...
by pavankvk
Thu Dec 08, 2005 3:25 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: APT_BadAlloc: Heap allocation failed.
Replies: 8
Views: 5258

How can we find out our current heap allocation for osh?
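The log above compares the soft and hard limits on the data segment, and you can inspect that same pair of process limits yourself. A minimal Python sketch using the standard resource module (Unix only; whether osh inherits these limits from the environment that launches it is an assumption you would need to verify on your setup):

```python
# Minimal sketch: inspect the soft/hard limits on the data segment (heap),
# the same pair the APT_CombinedOperatorController message is reporting.
# Unix-only; run it under the same environment that launches osh.
import resource

soft, hard = resource.getrlimit(resource.RLIMIT_DATA)
print(f"data segment soft limit: {soft}")  # -1 (RLIM_INFINITY) means unlimited
print(f"data segment hard limit: {hard}")
```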
by pavankvk
Mon Dec 05, 2005 4:25 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Upgrading to 7.5a from 7.01r1
Replies: 1
Views: 651

Upgrading to 7.5a from 7.01r1

Hi,

We are upgrading as specified. Are there any open issues with this upgrade?

Does it require any code changes?
by pavankvk
Mon Dec 05, 2005 4:22 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Date validation
Replies: 3
Views: 1584

What we did:

If you know the format of the date, e.g. it comes in as 12/5/2005, parse it, rebuild it as 12-05-2005, and then use the existing functions.
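The parse-then-rebuild approach above, sketched in Python for illustration (the M/D/YYYY input format is the example from the post; in the job this would be done with DataStage's own string/date functions):

```python
# Sketch of the approach above: parse the known incoming format (here M/D/YYYY,
# e.g. 12/5/2005) and rebuild it in the target format (12-05-2005) before
# handing it to the existing date functions.
from datetime import datetime

def reformat_date(raw: str) -> str:
    parsed = datetime.strptime(raw, "%m/%d/%Y")  # raises ValueError if invalid
    return parsed.strftime("%m-%d-%Y")

print(reformat_date("12/5/2005"))  # 12-05-2005
```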
by pavankvk
Mon Dec 05, 2005 4:20 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to divide a file into 8 parts
Replies: 10
Views: 5256

Generate a unique sequence number using the Surrogate Key stage, then add a Transformer and use the Mod() function: Mod(seq, 8) yields values 0 through 7, which you can use to route the rows. You should have an 8-way configuration file.
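The sequence-number-plus-Mod() routing, sketched in Python for illustration (file names are hypothetical; in the actual job the 8-way configuration file is what makes the eight streams run in parallel):

```python
# Sketch of the Mod() routing above: assign a sequence number to each record
# and send it to one of 8 output files based on seq mod 8 (values 0-7).
def split_eight_ways(in_path: str) -> None:
    outputs = [open(f"part_{i}.txt", "w") for i in range(8)]
    try:
        with open(in_path) as src:
            for seq, line in enumerate(src):  # seq plays the surrogate key role
                outputs[seq % 8].write(line)
    finally:
        for f in outputs:
            f.close()

split_eight_ways("input.txt")  # hypothetical input file name
```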
by pavankvk
Thu Dec 01, 2005 5:35 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Ascential Certification
Replies: 9
Views: 4950

Ray,

I am unable to find a link on the IBM site for this test. Can you provide one, please?
by pavankvk
Tue Nov 29, 2005 7:09 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Adding description to jobs
Replies: 8
Views: 3783

Adding description to jobs

Is there any way in DataStage to change the repository information in the backend, as in Informatica? In one of our projects, the requirement is to add a description to all jobs (there are 500+ jobs). Now, instead of opening each job and adding the description to it, can we do something directly i...
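One commonly suggested route (an assumption here, not something the post confirms) is to export the jobs to a .dsx file, patch the description text in the export, and re-import it. A hypothetical Python sketch; the field name "FullDescription" and the DSX layout assumed below must be checked against an actual export first, and a backup kept before any re-import:

```python
# Hypothetical sketch: inject a description line into every job record of a
# DSX export before re-importing. The "FullDescription" field name and the
# "BEGIN DSJOB" record marker assumed here need verifying against a real export.
def add_description(dsx_in: str, dsx_out: str, text: str) -> None:
    with open(dsx_in) as src, open(dsx_out, "w") as dst:
        for line in src:
            dst.write(line)
            # After each job record header, inject the shared description line.
            if line.strip().startswith("BEGIN DSJOB"):
                dst.write(f'   FullDescription "{text}"\n')

add_description("export.dsx", "export_patched.dsx", "Standard description")
```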
by pavankvk
Tue Nov 15, 2005 1:15 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Do we need to worry about the warnings?
Replies: 4
Views: 1364

Do we need to worry about the warnings?

Do we really need to worry about the warnings that appear in the log? My jobs have numerous warnings, but I have never fixed them; the jobs always do what they are supposed to do, with no issues.

Let me know if getting rid of warnings is really important.
by pavankvk
Tue Nov 15, 2005 1:10 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Null Handling
Replies: 3
Views: 2101

Re: Null Handling

Hi, while I am trying to convert datatypes for a few columns, like Varchar to Decimal, Varchar to Date, and Varchar to Timestamp, which are not nullable on the target table, I am getting the following warnings: 1) APT_CombinedOperatorController,2: Numeric string expected, got "". Use Decimal defau...
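That warning fires when an empty Varchar reaches the Decimal conversion. The usual fix is to test for the empty/null string first and substitute an explicit default (or route the row to a reject link), sketched here in Python for illustration (in the job this would be transformer logic, not Python):

```python
# Sketch of the usual fix for 'Numeric string expected, got ""': test for the
# empty/null string before converting, and substitute an explicit default
# instead of letting the conversion generate a warning.
from decimal import Decimal
from typing import Optional

def to_decimal(raw: Optional[str], default: Decimal = Decimal("0")) -> Decimal:
    if raw is None or raw.strip() == "":
        return default            # explicit default instead of a warning
    return Decimal(raw)

print(to_decimal(""))       # 0
print(to_decimal("12.50"))  # 12.50
```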
by pavankvk
Tue Nov 15, 2005 1:04 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How can I call a web service residing on a web server?
Replies: 3
Views: 1248

How can I call a web service residing on a web server?

Can I call a web service residing on a web server from DataStage?

If yes, please let me know. I am using 7.5.
by pavankvk
Tue Nov 08, 2005 12:05 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: How to increase the default 2GB limit for hashed files
Replies: 2
Views: 1200

How to increase the default 2GB limit for hashed files

Hi, our record length is 930 bytes, and we need to store it in a hashed file. The default size limit for a hashed file is 2GB. Until recently we didn't have any problem, but the volume of data has increased and the Hashed File stage is failing. How can we increase the 2GB limit? I checked ulimit for my ID and it's unli...
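A back-of-envelope check makes it plain why the failure appeared only once volume grew: at 930 bytes per record, a 2GB file tops out around 2.3 million records, and hashed-file overhead lowers the real figure further. A quick Python calculation of that ceiling:

```python
# Back-of-envelope check on the 2GB ceiling: at 930 bytes per record the file
# tops out near 2.3 million rows, before hashed-file overhead (which lowers it).
record_bytes = 930
limit_bytes = 2 * 1024**3            # 2 GB addressing limit

max_records = limit_bytes // record_bytes
print(f"~{max_records:,} records fit under 2 GB")  # ~2,309,122 records
```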