Search found 38 matches

by udayanguha
Wed Feb 25, 2015 2:36 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Record load getting committed even after job abort
Replies: 14
Views: 5979

Hi, I checked both the environment variables, i.e. $APT_ORAUPSERT_COMMIT_ROW_INTERVAL and $APT_ORAUPSERT_COMMIT_TIME_INTERVAL. They are set to their default values at the project level in the Administrator (5000 and 2 respectively). I want to override this property in my job, so I used these variables in my job. I ...
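For context, the two variables in the post describe a commit interval expressed in rows and in seconds. The sketch below is plain Python against SQLite, not DataStage or the Oracle connector; the constant names echo the variables from the post (with the project defaults of 5000 and 2 mentioned there), but the loader itself is hypothetical. It only illustrates why rows committed at an interval survive a later abort.

import sqlite3, time

# Hypothetical illustration of a row/time commit interval: commit after
# COMMIT_ROW_INTERVAL rows or COMMIT_TIME_INTERVAL seconds. 5000 and 2 are
# the project defaults mentioned in the post; this is not the DataStage
# implementation of $APT_ORAUPSERT_COMMIT_*_INTERVAL.
COMMIT_ROW_INTERVAL = 5000
COMMIT_TIME_INTERVAL = 2.0

def load_rows(conn, rows):
    cur = conn.cursor()
    pending, last_commit = 0, time.monotonic()
    for row in rows:
        cur.execute("INSERT INTO target(val) VALUES (?)", (row,))
        pending += 1
        if pending >= COMMIT_ROW_INTERVAL or time.monotonic() - last_commit >= COMMIT_TIME_INTERVAL:
            conn.commit()              # rows committed here survive a later abort
            pending, last_commit = 0, time.monotonic()
    conn.commit()                      # final commit for the remainder

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target(val INTEGER)")
load_rows(conn, range(12000))
print(conn.execute("SELECT COUNT(*) FROM target").fetchone()[0])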
by udayanguha
Sun Feb 22, 2015 8:06 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Partition method for creating key change column
Replies: 2
Views: 1884

Partition method for creating key change column

Hi, I am trying to create a key change column through the Sort stage. In the Partition tab, should I specify Auto partitioning and let DataStage take care of choosing the best partitioning method, or should I explicitly specify hash partitioning in the properties? I have heard different views from people. Some peo...
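As a hedged aside on the question being asked: the usual argument for an explicit hash partition is that a key-change column is only meaningful if all rows sharing a key land in the same partition and are sorted there; otherwise each partition sees only part of a key group and flags extra "changes". The Python sketch below is an invented illustration of that reasoning, not DataStage's Sort stage; the function names and sample rows are made up for the example.

from itertools import groupby

# Minimal sketch (not DataStage): hash-partition on the key, sort within
# each partition, then mark the first row of each key group as keyChange=1.
def hash_partition(rows, key, nparts):
    parts = [[] for _ in range(nparts)]
    for row in rows:
        parts[hash(row[key]) % nparts].append(row)   # same key -> same partition
    return parts

def add_key_change(partition, key):
    partition.sort(key=lambda r: r[key])             # sort within the partition
    out = []
    for _, group in groupby(partition, key=lambda r: r[key]):
        for i, row in enumerate(group):
            out.append({**row, "keyChange": 1 if i == 0 else 0})
    return out

rows = [{"cust": c, "amt": a} for c, a in [(1, 10), (2, 5), (1, 7), (3, 2), (2, 9)]]
for part in hash_partition(rows, "cust", nparts=2):
    print(add_key_change(part, "cust"))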
by udayanguha
Thu Feb 19, 2015 1:34 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Import error for decimal field in csv file
Replies: 5
Views: 2910

Hi,
Can anyone help with how to set the option Final Delimiter = 013 in the Sequential File stage? DS is not letting me type anything after a single character.
by udayanguha
Wed Feb 18, 2015 9:20 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Import error for decimal field in csv file
Replies: 5
Views: 2910

Hi Ray, Thanks a lot. One of the solutions worked and the other I could not understand. I set the record delimiter string = DOS format and it worked fine for me. I could not implement the second solution you suggested, i.e. Final Delimiter = 013; DataStage doesn't allow me to set any value more than 1 ch...
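For readers skimming this thread: 013 here presumably refers to the carriage-return character (ASCII 13) that a DOS/Windows file carries before each line feed, which is why "record delimiter string = DOS format" fixed the read. The Python sketch below is only an illustration of that CR+LF issue, not the Sequential File stage; the sample data is invented.

import csv

# Minimal sketch of the DOS-format problem in this thread (not DataStage):
# a Windows CSV terminates each record with CR+LF, so splitting on LF alone
# leaves a stray carriage return stuck to the last field of every row.
raw = "1,foo,123\r\n2,bar,456\r\n"

print([line.split(",") for line in raw.split("\n") if line])
# [['1', 'foo', '123\r'], ['2', 'bar', '456\r']]   <- CR corrupts the last column

print([row for row in csv.reader(raw.splitlines())])
# [['1', 'foo', '123'], ['2', 'bar', '456']]       <- CR+LF treated as the record delimiter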
by udayanguha
Wed Feb 18, 2015 8:50 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Import error for decimal field in csv file
Replies: 5
Views: 2910

Import error for decimal field in csv file

Hi, I am trying to read a csv file with 22 fields, the last of which is a decimal field named BOAT_HASH_KEY (decimal 38,0). The field is set to NULLABLE in the output column properties of the Sequential File stage. The format of the Sequential File stage is as below: Record level Final Delimiter = end Field defaults...
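As a rough illustration of what an import error on a trailing decimal column usually comes down to, here is a small Python sketch. The column name BOAT_HASH_KEY and the DECIMAL(38,0) / nullable constraints come from the post; the parsing function and the sample values are invented and are not how DataStage performs the import.

from decimal import Decimal, InvalidOperation

# Illustrative only: parse a field as DECIMAL(38,0), treating an empty
# value as NULL (the column is nullable in the post). A stray character,
# such as a trailing carriage return, makes the value unparseable.
def parse_boat_hash_key(field):
    if field.strip() == "":
        return None                              # nullable column: empty -> NULL
    try:
        value = Decimal(field)
        if value != value.to_integral_value():
            raise ValueError("scale 0 expected")
        if len(str(abs(int(value)))) > 38:
            raise ValueError("precision 38 exceeded")
        return value
    except (InvalidOperation, ValueError) as exc:
        raise ValueError(f"import error for BOAT_HASH_KEY: {field!r}") from exc

for sample in ["12345678901234567890", "", "12.5", "456\r"]:
    try:
        print(repr(sample), "->", parse_boat_hash_key(sample))
    except ValueError as err:
        print(err)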
by udayanguha
Tue Feb 17, 2015 2:05 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Record load getting committed even after job abort
Replies: 14
Views: 5979

Hi,
It's not a bulk load that I am trying to do; it's a normal load through the Oracle connector.
The Record count property has been set to 0.
by udayanguha
Mon Feb 16, 2015 8:41 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Record load getting committed even after job abort
Replies: 14
Views: 5979

Hi Ray,
Thanks!!
I have already set the property Record count = 0 under Transaction, but still some data is getting loaded after the job aborts. Is there some other property as well?
by udayanguha
Mon Feb 16, 2015 8:24 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Record load getting committed even after job abort
Replies: 14
Views: 5979

Record load getting committed even after job abort

Hi, I am loading some data into an Oracle table using the Oracle connector. I want the load to roll back completely from the Oracle table if the job aborts. For this I have set the Record count property under Isolation level to '0'. But I can still see records in the target table even though my job has abort...
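To make the behaviour the poster is after concrete, here is a minimal sketch in Python against SQLite, not the Oracle connector: if nothing is committed until the load finishes, an abort part-way through leaves the target table untouched, whereas any intermediate commit makes the rows loaded so far permanent even though the job later fails. Everything in the snippet is hypothetical apart from that general transactional idea.

import sqlite3

# Minimal sketch (not the Oracle connector): a single commit at the end
# means a failure before that point rolls the whole load back.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target(id INTEGER)")

try:
    for i in range(10_000):
        if i == 7_500:
            raise RuntimeError("simulated job abort")
        conn.execute("INSERT INTO target(id) VALUES (?)", (i,))
    conn.commit()                      # single commit after a clean run
except RuntimeError:
    conn.rollback()                    # abort: everything loaded so far is undone

print(conn.execute("SELECT COUNT(*) FROM target").fetchone()[0])   # prints 0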