Search found 48 matches

by mansoor_nb
Wed Feb 17, 2010 6:06 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: APT_CombinedOperatorController,0: [NCR][ODBC Teradata Driver
Replies: 7
Views: 3588

This issue is resolved. The issue was with the Teradata user ID that I was using. Thank you very much for the replies that you all have provided.
by mansoor_nb
Thu Jan 28, 2010 4:56 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: APT_CombinedOperatorController,0: [NCR][ODBC Teradata Driver
Replies: 7
Views: 3588

"Why are you using ODBC anyway? I think you need to get the ODBC drivers installed for Teradata. I think you should use the API stage instead." The sparse lookup option is not available in the Teradata API stage, and hence I am using the ODBC stage. The ODBC drivers are installed for Teradata. Thanks Ma...
by mansoor_nb
Thu Jan 28, 2010 3:56 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: APT_CombinedOperatorController,0: [NCR][ODBC Teradata Driver
Replies: 7
Views: 3588

APT_CombinedOperatorController,0: [NCR][ODBC Teradata Driver

Hello, I have a parallel job which uses two sparse lookups. The source is a dataset and the target is a dataset. The job flow is as follows: a) The data is looked up against a table in DB2 (the reference table) for CUST_NBR (data type Decimal 9,0). b) The CUST_NBR (data type Decimal 9,0) is used to lookup...
by mansoor_nb
Fri Jul 10, 2009 1:29 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Retrieve SQLCODE in Oracle Loader
Replies: 3
Views: 1751

You need to include "SKIP_UNUSABLE_INDEXES" in the Oracle Load Options, like "DIRECT=TRUE, SKIP_INDEX_MAINTENANCE=YES, SKIP_UNUSABLE_INDEXES=NO" or "DIRECT=TRUE, SKIP_INDEX_MAINTENANCE=TRUE, SKIP_UNUSABLE_INDEXES=FALSE". Then the skip_index_maintenance will prevail, no ma...
by mansoor_nb
Thu Jul 09, 2009 3:39 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Hi Performance job design
Replies: 5
Views: 2912

Instead of a lookup, one can use the Change Capture stage to capture the insert and update records. The Change Capture stage is also efficient in terms of performance.
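For what it's worth, a minimal sketch of the split (link and column names here are illustrative): with its default options the Change Capture stage appends a change_code column to its output, where 0 = copy, 1 = insert, 2 = delete and 3 = edit. Hash-partition and sort both input links on the key columns, then route the output, e.g. with a Filter stage:
Where clause (output 0): change_code = 1
Where clause (output 1): change_code = 3
Output 0 then carries the inserts and output 1 the updates.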
by mansoor_nb
Tue Mar 11, 2008 10:06 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Oracle Load Rejects
Replies: 4
Views: 1660

Hi Harsha, you can capture the reject records while you use the Oracle load method. The records rejected while loading the data into the table get saved in the scratch disk along with the control file, log file and the bad file. The bad file contains the bad data which failed to load into the table. Once you use the ...
by mansoor_nb
Tue Mar 11, 2008 9:15 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Rows to column
Replies: 3
Views: 1632

You can use the Combine Records stage, or you have to write logic in the Transformer to convert rows into columns.
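A rough sketch of the Transformer logic (the column names KEY and VAL and the link name lnk_in are hypothetical; the input must be hash-partitioned and sorted on KEY so the rows of a group arrive together): define stage variables, evaluated top to bottom,
svList: If lnk_in.KEY = svPrevKey Then svList : "," : lnk_in.VAL Else lnk_in.VAL
svPrevKey: lnk_in.KEY
and derive the output column from svList. Only the last row of each group carries the complete list, so follow the Transformer with a Remove Duplicates stage on KEY with Duplicate To Retain set to Last.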
by mansoor_nb
Mon Mar 03, 2008 4:15 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Loading SAP data through DataStage
Replies: 15
Views: 12032

There are several ways to load the data into the SAP system. 1) If the data which you are trying to load gets loaded into the SAP standard tables, then you need to write a BDC. a) First you need to prepare the data and place the file in the SAP system. b) Once the file is created, then the BDC (ABAP code) s...
by mansoor_nb
Tue Dec 11, 2007 2:54 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: load single space into the table's field in Oracle(load opt)
Replies: 2
Views: 990

The data type of the field is Varchar2(50), and I have tried both 0x20 and 0x0 for APT_STRING_PADCHAR.
by mansoor_nb
Tue Dec 11, 2007 1:12 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: load single space into the table's field in Oracle(load opt)
Replies: 2
Views: 990

load single space into the table's field in Oracle(load opt)

Hi all, I am extracting the data from Teradata and loading it into Oracle. The mapping is quite simple, a one-to-one mapping, but the volume of data to be written into the table in Oracle is huge, and that's why I am using the bulk load option. I have to pass a single space into one of the fields of ...
by mansoor_nb
Wed Jul 11, 2007 12:48 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Can we sort the data with two key columns.
Replies: 4
Views: 2338

Yes, you can sort or join the data using two columns. If the data is to be joined between two links, then both links should be hash-partitioned and sorted on the key columns, the join should be made on the key columns, and the partition type on the Join stage should be Same.
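A minimal setup sketch, assuming key columns KEY1 and KEY2 (the names are hypothetical): a) On each input link of the Join stage, set the partition type to Hash on KEY1 and KEY2 and enable Perform Sort on the same columns (or do this upstream in explicit Sort stages). b) Set KEY1 and KEY2 as the join keys on the Join stage. c) If the hashing and sorting were already done upstream, set the partition type on the Join stage inputs to Same so the data is not repartitioned.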
by mansoor_nb
Tue Jul 03, 2007 1:16 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to write in same file which is also a source file ?
Replies: 5
Views: 4437

You can do one thing: create an output staging file, and then in the after-job subroutine delete the source file and rename the staging file to the source file name.
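A minimal sketch, assuming ExecSH as the after-job subroutine (the paths are made up for illustration):
ExecSH input value: rm /data/in/source.txt && mv /data/in/staging.txt /data/in/source.txt
(mv on its own would also overwrite the target; the rm just makes the delete step explicit.)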
by mansoor_nb
Mon Mar 19, 2007 1:02 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to Filter certain Date values
Replies: 13
Views: 5253

Check the length of the input column: if the length is less than 10, then convert it into mm/dd/yyyy; else pass the input column through to the target.
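A sketch of the derivation, assuming the short values are m/d/yyyy strings that merely lack zero padding (the link and column names are hypothetical):
If Len(lnk.DT) < 10 Then Right("0" : Field(lnk.DT, "/", 1), 2) : "/" : Right("0" : Field(lnk.DT, "/", 2), 2) : "/" : Field(lnk.DT, "/", 3) Else lnk.DT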
by mansoor_nb
Thu Mar 08, 2007 5:29 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Transformer logic
Replies: 6
Views: 3723

You can also use the NUM function to check whether the source column is numeric or not. It will return 1 if the source column is numeric; otherwise it will return 0.
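For example, as a derivation (the column name is hypothetical):
If Num(lnk.AMOUNT) Then lnk.AMOUNT Else "0"
Num() is a DataStage BASIC function (server job or BASIC Transformer); if it is not available in your parallel Transformer, IsValid("int32", lnk.AMOUNT) is an alternative way to make the same check.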
by mansoor_nb
Thu Mar 08, 2007 5:25 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: type conversion
Replies: 8
Views: 2302

You will get these kinds of results if your source string field is not converted properly into the timestamp format before conversion, or if there are nulls in your source string column.

If there are nulls, then check for nulls first and then convert it into a timestamp.
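A sketch of the guarded conversion in a parallel Transformer (the column name and format string are assumptions):
If IsNull(lnk.TS_STR) Then SetNull() Else StringToTimestamp(lnk.TS_STR, "%yyyy-%mm-%dd %hh:%nn:%ss")
The target column has to be nullable for the SetNull() branch to work.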