Search found 38 matches
- Wed Feb 25, 2015 2:36 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Record load getting committed even after job abort
- Replies: 14
- Views: 5979
Hi, I checked both environment variables, i.e. $APT_ORAUPSERT_COMMIT_ROW_INTERVAL and $APT_ORAUPSERT_COMMIT_TIME_INTERVAL. They are set to their default values at project level in the Administrator (5000 and 2 respectively). I want to override this property in my job, so I used these variables in my job. I ...
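The row-interval behaviour being discussed can be illustrated outside DataStage. With a commit every N rows, everything up to the last commit boundary is permanent even if the load then aborts, which is why already-committed batches survive a job abort. A minimal sketch in plain Python/SQLite (the table name and the 5000-row interval are illustrative stand-ins, not taken from the job):

```python
import sqlite3

# Illustrative stand-in for $APT_ORAUPSERT_COMMIT_ROW_INTERVAL: commit every N rows.
COMMIT_ROW_INTERVAL = 5000

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER)")

committed = 0
for i in range(12000):
    conn.execute("INSERT INTO target VALUES (?)", (i,))
    if (i + 1) % COMMIT_ROW_INTERVAL == 0:
        conn.commit()          # rows up to here are now permanent
        committed = i + 1

conn.rollback()                # simulate a job abort after the last commit
remaining = conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]
print(remaining)               # 10000 -- the committed batches survive the abort
```

The rows between the last commit boundary (10000) and the abort point (12000) are the only ones rolled back; the rest are already in the table.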
- Sun Feb 22, 2015 8:06 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Partition method for creating key change column
- Replies: 2
- Views: 1884
Partition method for creating key change column
Hi, I am trying to create a key change column through the Sort stage. In the partition tab, should I specify Auto partitioning and let DataStage choose the best partitioning method, or should I explicitly set hash partitioning in the properties? I have heard different views from people. Some peo...
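Whatever Auto happens to pick, the usual argument for explicit hash partitioning before a key-change sort is that every row sharing a key must land on the same partition, or the key-change flag can fire spuriously at partition boundaries. A minimal sketch of that guarantee (plain Python, not DataStage; the key values are made up):

```python
import hashlib

def partition_of(key, nparts):
    """Deterministically map a key to a partition, like hash partitioning."""
    digest = hashlib.md5(str(key).encode()).hexdigest()
    return int(digest, 16) % nparts

rows = [("CUST1", 10), ("CUST2", 20), ("CUST1", 30), ("CUST3", 40), ("CUST2", 50)]
nparts = 4
partitions = {p: [] for p in range(nparts)}
for key, value in rows:
    partitions[partition_of(key, nparts)].append((key, value))

# Every occurrence of a given key ends up in the same partition, so a
# per-partition sort can detect key changes without seeing a key "split"
# across partitions.
for p, rs in partitions.items():
    print(p, rs)
```

Because `partition_of` depends only on the key, the same key can never appear in two partitions, which is exactly the property the key-change column needs.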
- Thu Feb 19, 2015 1:34 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Import error for decimal field in csv file
- Replies: 5
- Views: 2910
- Wed Feb 18, 2015 9:20 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Import error for decimal field in csv file
- Replies: 5
- Views: 2910
Hi Ray, Thanks a lot. One of the solutions worked and the other I could not understand. I set the record delimiter string to DOS format and it worked fine for me. I could not implement the 2nd solution you suggested, i.e. Final Delimiter = 013; DataStage doesn't allow me to set any value more than 1 ch...
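For reference, "DOS format" just means each record ends with CR+LF (carriage return, octal 013, followed by line feed) rather than a bare LF. If a reader splits on LF alone, every record keeps a trailing CR that corrupts the last field, which is the typical failure mode here. A quick illustration with made-up data, not the actual file:

```python
# A DOS-format record ends in CR+LF; a Unix record ends in LF alone.
dos_data = "1,abc,12.5\r\n2,def,7.25\r\n"

# Splitting on LF alone leaves a stray carriage return on every record,
# so the last (decimal) field becomes "12.5\r" and fails to parse.
bad_rows = dos_data.split("\n")
print(repr(bad_rows[0]))   # '1,abc,12.5\r'

# Splitting on the full CR+LF record delimiter parses cleanly.
good_rows = [r for r in dos_data.split("\r\n") if r]
print(float(good_rows[0].split(",")[-1]))   # 12.5
```

This is consistent with the fix above: declaring the record delimiter as DOS format tells the stage to consume both characters, whereas a single-character final delimiter cannot express the two-character sequence.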
- Wed Feb 18, 2015 8:50 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Import error for decimal field in csv file
- Replies: 5
- Views: 2910
Import error for decimal field in csv file
Hi, I am trying to read a CSV file with 22 fields, the last of which is a decimal field named BOAT_HASH_KEY (decimal 38,0). The field is set to nullable in the output column properties of the Sequential File stage. The format of the Sequential File stage is as below: Record level: Final Delimiter = end; Field defaults...
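Independent of the delimiter question, a nullable decimal(38,0) column implies two parse rules: an empty token must map to NULL rather than fail, and a scale of 0 allows no fractional digits. A generic sketch of those rules (the field name comes from the post; the parsing function itself is plain Python, not the Sequential File stage):

```python
from decimal import Decimal, InvalidOperation

def parse_boat_hash_key(token):
    """Parse a nullable decimal(38,0) field: empty token means NULL (None)."""
    token = token.strip()
    if token == "":
        return None            # nullable column: empty token -> NULL
    value = Decimal(token)
    if value != value.to_integral_value():
        raise InvalidOperation("scale 0: no fractional digits allowed")
    return value

print(parse_boat_hash_key("12345"))   # 12345
print(parse_boat_hash_key(""))        # None
```

Anything the import rejects, such as a value with a stray non-numeric character, surfaces as a conversion error on exactly this last field.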
- Tue Feb 17, 2015 2:05 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Record load getting committed even after job abort
- Replies: 14
- Views: 5979
- Mon Feb 16, 2015 8:41 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Record load getting committed even after job abort
- Replies: 14
- Views: 5979
- Mon Feb 16, 2015 8:24 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Record load getting committed even after job abort
- Replies: 14
- Views: 5979
Record load getting committed even after job abort
Hi, I am loading some data into an Oracle table using the Oracle connector. I want the load to roll back completely from the Oracle table if the job aborts. For this I have set the Record Count property under Isolation Level to '0'. But I can still see records in the target table even though my job has abort...
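The all-or-nothing behaviour being asked for is, generically, a single transaction: nothing is committed until the whole load succeeds, so a failure anywhere rolls back everything. A sketch of that pattern in plain Python/SQLite (not the Oracle connector itself; the table name and abort point are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER)")

try:
    for i in range(100):
        if i == 57:
            raise RuntimeError("simulated job abort mid-load")
        conn.execute("INSERT INTO target VALUES (?)", (i,))
    conn.commit()        # only reached if every row loaded
except RuntimeError:
    conn.rollback()      # abort: the whole load disappears

count = conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]
print(count)             # 0 -- no partial load survives the abort
```

If any intermediate commit fires before the abort (as in the commit-interval discussion above), the rows up to that commit are permanent and this all-or-nothing guarantee is lost.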