Search found 394 matches

by samsuf2002
Tue Jun 02, 2009 9:51 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Null handling in transformer
Replies: 19
Views: 11588

Are you able to view those records when you click "View Data" in the Sequential File stage? What is the job design?
by samsuf2002
Mon Jun 01, 2009 10:19 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Records Duplicating
Replies: 12
Views: 5120

Try using a Join or Lookup stage for the join instead of doing it in SQL.
by samsuf2002
Mon Jun 01, 2009 10:04 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Null handling in transformer
Replies: 19
Views: 11588

Are those columns set as not nullable? If yes, then set them to nullable in the Sequential File stage and the Transformer.

What is your job design?
Once you make the column nullable, edit the column and set the null field value = ''.
by samsuf2002
Mon Jun 01, 2009 3:49 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Null handling in transformer
Replies: 19
Views: 11588

Try making the column nullable, and if it's an integer then make it varchar (if that suits your requirement), then do the null handling later.

Null handling for the column will then be ---> If IsNull(col1) Then 'Unknown' Else col1......
by samsuf2002
Mon Jun 01, 2009 3:38 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Records Duplicating
Replies: 12
Views: 5120

Are you doing a join?

Posting the SQL you are using will help us to help you.
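
For reference, a minimal sketch (with made-up table and column names) of the kind of SQL join that commonly duplicates records when the join key is not unique on one side:

-- If CUSTOMER_ID repeats in ORDERS, each CUSTOMERS row comes out
-- once per matching ORDERS row, so the customer appears duplicated.
SELECT c.CUSTOMER_ID,
       c.CUSTOMER_NAME,
       o.ORDER_DATE
FROM   CUSTOMERS c
JOIN   ORDERS    o
  ON   c.CUSTOMER_ID = o.CUSTOMER_ID;

Doing the same join in a Join or Lookup stage does not change that one-to-many relationship by itself, but it makes the key easier to inspect and the duplicates easier to remove (for example with a Remove Duplicates stage).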
by samsuf2002
Tue May 12, 2009 9:44 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: update and insert
Replies: 4
Views: 2886

Thanks for the response, Mike. Actually the timestamp field is not part of the key here.

What I am doing right now is extracting the updates and inserts into two different files and then updating and inserting into the table in two separate jobs, which seems to be working well.
by samsuf2002
Tue May 12, 2009 8:48 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: update and insert
Replies: 4
Views: 2886

update and insert

Hi, I have a requirement where I want to update the existing records that have status 'open' to 'close', and I also want to add a new record for the same key with status 'open' and the current timestamp. I am using the Oracle stage. So if there are 5 open records then I should have 10 records in the output...
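
Just to illustrate the intended outcome, here is a rough sketch in plain Oracle SQL; STATUS_TBL, KEY_COL, LAST_UPDATED and OPEN_KEYS are made-up names, not from the original post:

-- 1) Remember which keys are currently open (hypothetical staging table)
CREATE TABLE OPEN_KEYS AS
SELECT KEY_COL FROM STATUS_TBL WHERE STATUS = 'open';

-- 2) Close the existing open rows
UPDATE STATUS_TBL
SET    STATUS = 'close'
WHERE  STATUS = 'open';

-- 3) Insert one fresh 'open' row per key with the current timestamp
INSERT INTO STATUS_TBL (KEY_COL, STATUS, LAST_UPDATED)
SELECT KEY_COL, 'open', SYSTIMESTAMP
FROM   OPEN_KEYS;

The two-file approach described in the follow-up post (extract the updates and the inserts separately, then apply them in separate jobs) achieves the same split of steps 2 and 3.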
by samsuf2002
Wed May 06, 2009 1:14 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Row comparison in parallel
Replies: 3
Views: 1676

This is a pivot requirement (rows to columns); search the forum for "Pivot".
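
In SQL terms this is the classic rows-to-columns pattern that a pivot produces; a hypothetical sketch (table and column names invented here, not from the thread):

-- One row per (CUST_ID, PHONE_TYPE) in, one row per CUST_ID out
SELECT CUST_ID,
       MAX(CASE WHEN PHONE_TYPE = 'HOME'   THEN PHONE_NO END) AS HOME_PHONE,
       MAX(CASE WHEN PHONE_TYPE = 'OFFICE' THEN PHONE_NO END) AS OFFICE_PHONE,
       MAX(CASE WHEN PHONE_TYPE = 'MOBILE' THEN PHONE_NO END) AS MOBILE_PHONE
FROM   CUST_PHONES
GROUP BY CUST_ID;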
by samsuf2002
Fri Apr 10, 2009 9:41 am
Forum: General
Topic: Dataset
Replies: 11
Views: 2474

Thanks to all for the valuable information.

I am still wondering about those empty files created for the same dataset in the same dsn folder (example shown in my first post). If anyone can shed some light on it, that would help.
by samsuf2002
Wed Apr 08, 2009 7:54 am
Forum: General
Topic: Dataset
Replies: 11
Views: 2474

Dataset

Hi All, I am working on datasets and want to clarify some of my doubts. I am running a job that creates a dataset data.ds in my working folder, and the data is stored in dsn directories (0-7), but what I see is that all the records (1200 rows) are stored in just one dsn folder, "dsn0", and ...
by samsuf2002
Fri Mar 27, 2009 11:11 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Sequence calling another Sequence
Replies: 2
Views: 1262

Is your new job activity in SEQ2 running? If you have added a job activity in SEQ1 and named it SEQ2, then when you run it, it will obviously run the old jobs along with your new job as per the sequence. If you just want to run your new jobs then you need to create a separate sequence. If I understood you c...
by samsuf2002
Fri Mar 27, 2009 11:02 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Truncate & Load in Oracle Stage job is aborting due to F
Replies: 2
Views: 2107

If you are not using RCP, then check the columns from the source dataset against the target Oracle table to see if any column is missing.
by samsuf2002
Fri Mar 27, 2009 10:59 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: still getting the same sequential file warning
Replies: 2
Views: 1481

Did you try clearing the partitioning from the Datasets?
by samsuf2002
Fri Mar 27, 2009 10:55 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: CDC Stage Issue...
Replies: 6
Views: 3276

Make sure the columns you are comparing in the CDC stage are the same in both input links.
by samsuf2002
Fri Mar 20, 2009 8:41 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Webservices
Replies: 18
Views: 6411

Just noticed that I don't have 'jre' in the java directory for the path "/appl/datastage/IBM/InformationServer/Server/DSEngine/java/jre/bin/sovvm/libjvm.a".