Hi Experts,
I am using the Oracle Connector Stage and I want to process all the records in a single array and commit only once, at the end of the load.
But I am getting the error message below when I enter 0 in the Array Size property of the Oracle Connector Stage:
Property Array size has value(0) which is too small. The minimum allowed value is (1)
This is very strange. According to the DataStage documentation, 0 should be entered when we want all the records in one array.
But entering 0 throws this error message.
Please share your comments on this.
Thanks,
Ankur
Array Size Property of Oracle Connector Stage
There is no definitive size to use; 2000 is the default. However, you may find it is more efficient to use 1000 or 10000. The right number depends on the size of the data you are sending/receiving, your network, and your database. I typically parameterize the array and commit sizes so the values can be altered and experimented with easily to find the right number. Test several different values and see what array and commit sizes work best for you.
I will say on small volume jobs I don't mess with this option often, its just not worth the time.
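To illustrate the trade-off Keith describes outside of DataStage, here is a minimal sketch of batched loading with separate array (batch) and commit sizes, using Python's standard `sqlite3` module. The table name `txn` and the helper `load_in_batches` are hypothetical; the point is only that the two knobs are independent, which is why parameterizing both and experimenting makes sense.

```python
import sqlite3

def load_in_batches(conn, rows, array_size, commit_interval):
    """Insert rows in batches of array_size, committing every commit_interval rows."""
    cur = conn.cursor()
    pending = 0
    for i in range(0, len(rows), array_size):
        batch = rows[i:i + array_size]
        # One round trip per batch (the "array size" knob)
        cur.executemany("INSERT INTO txn (id, amount) VALUES (?, ?)", batch)
        pending += len(batch)
        # Transaction boundary (the "commit size" knob)
        if pending >= commit_interval:
            conn.commit()
            pending = 0
    conn.commit()  # commit any remainder

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txn (id INTEGER, amount REAL)")
rows = [(i, i * 1.5) for i in range(10)]
load_in_batches(conn, rows, array_size=4, commit_interval=8)
print(conn.execute("SELECT COUNT(*) FROM txn").fetchone()[0])  # → 10
```

A larger array size cuts round trips; a larger commit interval cuts transaction overhead but holds more uncommitted work, which is exactly why the sweet spot varies with data volume, network, and database.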
Keith Williams
keith@peacefieldinc.com
I am now trying APT_ORAUPSERT_COMMIT_ROW_INTERVAL = 0, which is supposed to commit the records only when the job finishes.
But this is not working either. Even when the job aborts, records are still going into the destination table.
Director Log
pscSrcStmLuErrors.pscErrorHandling.boErrorHandling,0: Severity 1 error (DS_LUF_8) encountered (FATAL,DataStage,Datastage Lookup Reject Fatal). Aborting Job.
pscSrcStmLuErrors.pscErrorHandling.boErrorHandling,0: The runLocally() of the operator failed.
pscSrcStmLuErrors.pscErrorHandling.boErrorHandling,0: Input 0 consumed 1 records.
oraSAS_INS_CST_TXN,0: Number of rows inserted on the current node: 148.
main_program: Step execution finished with status = FAILED.
Job jp0802020SASCustomerTransactions_wip_19thAugust aborted.
I want nothing to be inserted into the destination table when the job aborts.
For information: pscSrcStmLuErrors.pscErrorHandling.boErrorHandling is called when a record gets rejected; the jobs are coded to call it. It issues a DataStage FATAL, Severity 1 error which does abort the job, but records still get inserted into the destination table.
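The behavior being asked for is all-or-nothing loading: every insert in one transaction, rolled back if anything fails. As a conceptual sketch (again using Python's `sqlite3` rather than DataStage; the table `txn` and helper `load_all_or_nothing` are hypothetical), the pattern looks like this:

```python
import sqlite3

def load_all_or_nothing(conn, rows):
    """Insert every row in one transaction; roll back all of them on any failure."""
    cur = conn.cursor()
    try:
        cur.executemany("INSERT INTO txn (id, amount) VALUES (?, ?)", rows)
        conn.commit()
    except Exception:
        conn.rollback()  # an abort leaves the table untouched
        raise

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txn (id INTEGER PRIMARY KEY, amount REAL)")
good = [(1, 10.0), (2, 20.0)]
bad = [(3, 30.0), (3, 40.0)]  # duplicate key forces a failure mid-load
try:
    load_all_or_nothing(conn, good + bad)
except sqlite3.IntegrityError:
    pass
print(conn.execute("SELECT COUNT(*) FROM txn").fetchone()[0])  # → 0
```

In the log above, the connector committed 148 rows on its node before the abort, which suggests commits were still happening per interval on each node rather than once at end of job; the all-or-nothing effect only holds if no commit is issued until the whole load succeeds.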