Search found 11 matches

by om.ranjan
Mon Oct 19, 2009 5:42 pm
Forum: Information Analyzer (formerly ProfileStage)
Topic: How to perform a bulk delete or optimize a delete operation
Replies: 4
Views: 2846

ArndW wrote:How about partitioning your database according to this key, then a delete operation could involve just removing one or more database partitions. ...
No partitioning has been used.

Thanks,
Ranjan
by om.ranjan
Fri Oct 16, 2009 4:56 pm
Forum: Information Analyzer (formerly ProfileStage)
Topic: How to perform a bulk delete or optimize a delete operation
Replies: 4
Views: 2846

How to perform a bulk delete or optimize a delete operation

Hi,

I have to delete millions of records from an Oracle database table every quarter (a purge operation). The job reads unique IDs from a file that identify the records to be deleted.

With this in mind, how do I perform a bulk delete?

Thanks,
Ranjan
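Since no database partitioning is in place (so dropping partitions is out), the usual fallback for a quarterly purge like this is to batch the deletes and commit per batch, rather than one statement per ID or one giant transaction. A minimal sketch of that idea, using SQLite in place of Oracle and a hypothetical `target_table` with an `id` column (against Oracle the same `executemany` pattern applies, with `:1`-style bind placeholders instead of `?`):

```python
import sqlite3

BATCH_SIZE = 1000  # deletes per commit; tune against rollback/undo limits

def bulk_delete(conn, id_file):
    """Read one unique ID per line from id_file and delete matching rows
    from target_table in batches, committing after each batch so the
    open transaction (and, in Oracle terms, undo usage) stays bounded."""
    cur = conn.cursor()
    deleted = 0
    batch = []

    def flush():
        nonlocal deleted
        if batch:
            cur.executemany("DELETE FROM target_table WHERE id = ?", batch)
            conn.commit()
            deleted += cur.rowcount  # sqlite3 sums rowcount over executemany
            batch.clear()

    with open(id_file) as fh:
        for line in fh:
            key = line.strip()
            if key:
                batch.append((key,))
            if len(batch) >= BATCH_SIZE:
                flush()
    flush()  # remainder smaller than one full batch
    return deleted
```

This is a sketch of the batching technique only, not the DataStage job itself; the table and column names are assumptions.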
by om.ranjan
Mon Oct 05, 2009 6:27 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Source has data but not populating in Oracle
Replies: 11
Views: 5639

nagarjuna wrote:Hi Kim ,

I think they have mentioned "load" as an option. I believe there is no reject link for it.
Hi Kim,

Nag is correct, there is no reject link.

Thanks,
Ranjan
by om.ranjan
Thu Oct 01, 2009 12:07 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Source has data but not populating in Oracle
Replies: 11
Views: 5639

om.ranjan wrote:
ArndW wrote:I bet your PX job has the datatype "VarChar" for this column. Declare the type as numeric in your job and perform an explicit conversion in PX. ...
It is decimal (11,3) NULLABLE

Hi All,

Any update on this issue will be appreciated.

Thanks,
Ranjan
by om.ranjan
Thu Oct 01, 2009 12:04 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: SQL*Loader-2026: the load was aborted because SQL Loader can
Replies: 3
Views: 16033

nagarjuna wrote:Speak to DBA ....
The tablespace has been extended; jobs are running fine now.

Thanks,

Ranjan :D
by om.ranjan
Thu Oct 01, 2009 11:10 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Jobs keep on running without loading a single record into
Replies: 10
Views: 5946

Do you have the same issue if you run on a single node? For upsert you must have a unique index on the table, define the unique column(s) as key in the table metadata, and hash partition on that key. I've already specified the unique index column as a key in DataStage; when I tried with hash partition, it...
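The advice quoted above (unique index, key in the metadata, hash partition on that key) works because hash partitioning is stable. A sketch of the idea, with a hypothetical four-node configuration; this mimics the guarantee DataStage's hash partitioner provides, not its actual hash function:

```python
import hashlib

NUM_NODES = 4  # hypothetical parallel configuration

def node_for(key, nodes=NUM_NODES):
    """Stable hash partitioning: every row carrying the same key value
    lands on the same node. For a parallel upsert this matters because
    it stops two nodes from trying to insert/update the same unique
    key simultaneously and stalling on lock waits."""
    digest = hashlib.sha1(str(key).encode("utf-8")).hexdigest()
    return int(digest, 16) % nodes
```

Running on a single node, the other suggestion in the quote, removes the contention entirely at the cost of parallelism.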
by om.ranjan
Thu Oct 01, 2009 10:56 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: SQL*Loader-2026: the load was aborted because SQL Loader can
Replies: 3
Views: 16033

SQL*Loader-2026: the load was aborted because SQL Loader can

Hi, I have created a job in DataStage to load data into a target table using the Oracle Enterprise stage (the source table has approx. 200 million records). The job is throwing the error message below: ORA-01653: unable to extend table <schema_name.table_name>BK by 128 in tablespace D_UTMDM_1M_01 SQL*Loader...
by om.ranjan
Thu Oct 01, 2009 7:29 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Source has data but not populating in Oracle
Replies: 11
Views: 5639

ArndW wrote:I bet your PX job has the datatype "VarChar" for this column. Declare the type as numeric in your job and perform an explicit conversion in PX. ...
It is decimal (11,3) NULLABLE
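ArndW's suggestion in the quoted reply, declaring the column numeric and converting explicitly rather than letting a VarChar slide through, can be sketched as below. The function name and the DECIMAL(11,3) range check are illustrative assumptions; in the actual job the conversion would live in a Transformer stage:

```python
from decimal import Decimal, InvalidOperation

def to_decimal_11_3(raw):
    """Explicitly convert an incoming string to a DECIMAL(11,3) value;
    return None (NULL) for empty input, and raise on anything
    non-numeric instead of letting the row vanish silently."""
    if raw is None or raw.strip() == "":
        return None  # the column is NULLABLE
    try:
        value = Decimal(raw.strip()).quantize(Decimal("0.001"))
    except InvalidOperation:
        raise ValueError(f"not numeric: {raw!r}")
    if abs(value) >= Decimal("100000000"):  # 11 digits total, 3 after the point
        raise ValueError(f"overflows DECIMAL(11,3): {raw!r}")
    return value
```

Rejecting (rather than dropping) the bad values is exactly what makes a "source has data but nothing lands in Oracle" situation visible.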
by om.ranjan
Thu Oct 01, 2009 7:28 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Source has data but not populating in Oracle
Replies: 11
Views: 5639

Are you using an Oracle Enterprise stage? Are you capturing the rejected rows using a reject link? If so, push these rows into some structure (maybe a text file) that you can review with a hex edito ... Yes, Oracle Enterprise stage; I have a reject link on the Transformer, not on the Oracle Enterprise stag...
by om.ranjan
Fri Sep 11, 2009 2:36 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Jobs keep on running without loading a single record into
Replies: 10
Views: 5946

Do you have the same issue if you run on a single node? For upsert you must have a unique index on the table, define the unique column(s) as key in the table metadata, and hash partition on that key. I've already specified the unique index column as a key in DataStage; when I tried with hash partition, it...
by om.ranjan
Wed Sep 09, 2009 11:14 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Jobs keep on running without loading a single record into
Replies: 10
Views: 5946

What is the commit size? Do a hash partition on the unique key column. I have introduced the environment variable $APT_ORAUPSERT_COMMIT_ROW_INTERVAL and set the commit interval to 500; with this value the job initially worked fine, but when I increased the number of input records (up to 10000) it again got into ...
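$APT_ORAUPSERT_COMMIT_ROW_INTERVAL controls how many rows the upsert processes between commits. The mechanism can be sketched as below, with SQLite standing in for Oracle and a hypothetical table `t`; the point is that the interval bounds the size of the open transaction, so a larger input only means more commits:

```python
import sqlite3

COMMIT_ROW_INTERVAL = 500  # mirrors $APT_ORAUPSERT_COMMIT_ROW_INTERVAL

def upsert_rows(conn, rows):
    """Upsert (insert-or-update) each (id, val) pair, committing every
    COMMIT_ROW_INTERVAL rows so the open transaction never grows
    unbounded and an aborted run loses at most one interval of work."""
    cur = conn.cursor()
    pending = 0
    for key, val in rows:
        cur.execute(
            "INSERT INTO t (id, val) VALUES (?, ?) "
            "ON CONFLICT(id) DO UPDATE SET val = excluded.val",
            (key, val),
        )
        pending += 1
        if pending >= COMMIT_ROW_INTERVAL:
            conn.commit()
            pending = 0
    conn.commit()  # flush the final partial interval
```

This is only a sketch of the commit-interval behaviour, not of the DataStage Oracle Enterprise stage itself.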