Search found 22 matches

by jenny_wang
Mon Jun 29, 2009 9:05 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: how to solve the "Abnormal termination" in XMLInput
Replies: 1
Views: 1385

how to solve the "Abnormal termination" in XMLInput

hi, guys. I have a job designed like this: Folder --> XmlInput --> Transformer --> Sequential File. When the source file is around 800 MB, I get "Abnormal termination of stage XmlInput detected". I searched the forum, and someone said this can be solved by setting APT_DEFAULT_TRANSPORT_BLOCK_...
by jenny_wang
Tue Oct 21, 2008 1:53 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: performance issue with updating a huge number of records
Replies: 5
Views: 2565

hi, Mike
Actually, most of the records should be updated, not all of them.
by jenny_wang
Thu Oct 16, 2008 7:49 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: performance issue with updating a huge number of records
Replies: 5
Views: 2565

Thanks, Mike.
Could you give me some suggestions about the design?
The Oracle stage only does updates, without inserting,
so the performance is bad.
by jenny_wang
Wed Oct 15, 2008 2:12 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: performance issue with updating a huge number of records
Replies: 5
Views: 2565

performance issue with updating a huge number of records

I created a job to update records in a table. The job is designed like this:

DataSet----->Join(Inner)<----Oracle(Read)
                |
                |
                |
          Oracle(Update)

If the DataSet and Oracle(Read) don't have so many records, the update speed is OK, but if the Oracle(Read) outputs about 200 t...
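For illustration only, not the actual Oracle stage: an update-only target behaves roughly like issuing one keyed UPDATE per joined row, which is why a large row count on that link is slow. A minimal Python sketch using the standard sqlite3 module; the target table, columns, and values are made up.

    import sqlite3

    # Hypothetical table standing in for the Oracle target (illustration only).
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, val TEXT)")
    conn.executemany("INSERT INTO target VALUES (?, ?)", [(i, "old") for i in range(5)])

    # Rows coming out of the inner join: (new value, key to update).
    joined_rows = [("new", 1), ("new", 3)]

    # Update-only write: one UPDATE per joined row, never an INSERT.
    conn.executemany("UPDATE target SET val = ? WHERE id = ?", joined_rows)
    conn.commit()
    print(conn.execute("SELECT id, val FROM target ORDER BY id").fetchall())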
by jenny_wang
Mon Sep 01, 2008 2:32 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Deadlock detected while inserting into Oracle table
Replies: 6
Views: 5288

Try using sequential mode when inserting.
by jenny_wang
Tue Aug 12, 2008 9:01 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: issues with a generic job for updating foreign keys of tables
Replies: 0
Views: 786

issues with a generic job for updating foreign keys of tables

hi, all. In the database there are some tables which have foreign keys, and the tables the foreign keys reference keep all historical records (if a record is updated, the older one is marked as 'O', the latest is marked as 'N', and there is a column to identify the same record). Now I want to updat...
by jenny_wang
Mon Jul 21, 2008 2:23 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Lookup Stage gets more records than primary link records
Replies: 8
Views: 4568

I removed the duplicate records and the job works well. Thanks for the help!
by jenny_wang
Mon Jul 14, 2008 3:47 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Lookup Stage gets more records than primary link records
Replies: 8
Views: 4568

hi, mahadev. If there are duplicate entries, warnings will show up in the log. I have a column named site_unit; the Oracle stage outputs two columns named site_key and site_unit. If the two site_unit values are identical, the site_key is output; otherwise the site_key is null. I am not clear about what you said "...
by jenny_wang
Mon Jul 14, 2008 3:26 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Lookup Stage gets more records than primary link records
Replies: 8
Views: 4568

hi, Ray
Which partitioning should be used? How can I know whether this is caused by the wrong partitioning?

I found that there are duplicate entries in the Oracle output, but there are no warnings in the log.
by jenny_wang
Mon Jul 14, 2008 3:02 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Lookup Stage gets more records than primary link records
Replies: 8
Views: 4568

Lookup Stage gets more records than primary link records

The job uses a Lookup stage. The primary link reads records from records.ds, and the reference link is an Oracle stage. records.ds has 22697 records and the Oracle stage outputs 304 records, but the Lookup stage outputs 68091 records.

Could you please help me find out why this happened?
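A minimal Python sketch, not DataStage code, of why duplicate keys on the reference link inflate the lookup output; the column names site_unit and site_key come from a later post in this topic, and the sample values are made up.

    # Primary link rows (normally read from records.ds).
    primary = [{"site_unit": "U1"}, {"site_unit": "U2"}]

    # Reference link rows (normally the Oracle output); key "U1" is duplicated.
    reference = [
        {"site_unit": "U1", "site_key": 101},
        {"site_unit": "U1", "site_key": 102},
        {"site_unit": "U2", "site_key": 201},
    ]

    # A lookup that returns every matching reference row behaves like this:
    output = [
        dict(p, site_key=r["site_key"])
        for p in primary
        for r in reference
        if r["site_unit"] == p["site_unit"]
    ]

    # 3 rows out of 2 primary rows: each duplicate reference key adds a row,
    # which is how 22697 primary rows can become 68091 output rows.
    print(len(output))

Removing the duplicate reference rows, as in the later reply, brings the counts back in line.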
by jenny_wang
Wed Mar 19, 2008 12:13 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Problem with converting EBCDIC to ASCII
Replies: 10
Views: 11376

I tried Char(), but DataStage can't recognize the 8-bit ASCII, so I must enable NLS and set the mapping character set to "ISO8859-1+MARKS", right?

Thanks!
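A minimal Python sketch of the conversion under discussion, assuming the source is EBCDIC code page 037 (the real code page may differ); it shows that decoding to Unicode and re-encoding as 8-bit ISO8859-1 preserves the diacritics.

    # Hypothetical EBCDIC (cp037) bytes: 0xC3 = 'C', 0x51 = 'é'.
    ebcdic_bytes = bytes([0xC3, 0x51])

    text = ebcdic_bytes.decode("cp037")        # EBCDIC -> Unicode
    latin1_bytes = text.encode("iso8859-1")    # Unicode -> 8-bit ISO8859-1

    print(text)           # Cé
    print(latin1_bytes)   # b'C\xe9'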
by jenny_wang
Tue Mar 18, 2008 12:59 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Problem with converting EBCDIC to ASCII
Replies: 10
Views: 11376

Actually I have 4 columns that contain those diacritic characters, and I am not sure how many different diacritic characters will be used in the columns. Does that mean that if I want to convert those characters correctly, first I must enable NLS, then write code as in PhilHibbs's post to replace all diacritic characters wi...
by jenny_wang
Tue Mar 18, 2008 12:56 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Problem with converting EBCDIC to ASCII
Replies: 10
Views: 11376

Actually I have 4 columns that contain those diacritic characters, and I am not sure how many different diacritic characters will be used in the columns. Does that mean that if I want to convert those characters correctly, first I must enable NLS, then write code as in PhilHibbs's post to replace all diacritic characters wi...
by jenny_wang
Sat Dec 29, 2007 5:19 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: write data with different numbers of columns to a single file
Replies: 2
Views: 1503

write data with different numbers of columns to a single file

There are 3 output links in a Transformer.
link1 has 5 columns, like:
1,2,3,5,4
22,33,34,16,33
link2 has 8 columns:
1,2,4,6,5,4,3,5
100,34,23,12,55,65,77,44
link3 has 7 columns:
1,1,34,33,55,4,3
2,4,42,32,44,55,33
I want to write them into a single file:
1,2,3,5,4
22,33,34,16,33
1,2,4,6,5,4,3,5
100,34,23,12,55,...
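A minimal Python sketch, not a DataStage design, of the desired result: rows with different column counts written as comma-delimited lines into one file. The file name combined.txt is made up; the row values come from the post above.

    # Rows from the three Transformer output links.
    link1 = [[1, 2, 3, 5, 4], [22, 33, 34, 16, 33]]
    link2 = [[1, 2, 4, 6, 5, 4, 3, 5], [100, 34, 23, 12, 55, 65, 77, 44]]
    link3 = [[1, 1, 34, 33, 55, 4, 3], [2, 4, 42, 32, 44, 55, 33]]

    # Append every row to one comma-delimited file, regardless of column count.
    with open("combined.txt", "w") as out:
        for rows in (link1, link2, link3):
            for row in rows:
                out.write(",".join(str(v) for v in row) + "\n")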