Date Conversion cache disabled, job aborts
Moderators: chulett, rschirm, roy
-
- Premium Member
- Posts: 20
- Joined: Mon Dec 19, 2005 10:00 pm
- Location: UK
Hi,
Sequential file --> transformer --> surrogate key generator --> Oracle enterprise stage.
When I run the job, it aborts with the error below after loading 23023 records into the table, even though more rows than that are read from the sequential file.
APT_CombinedOperatorController(1),0: Caught unknown exception from runLocally().
APT_CombinedOperatorController(1),0: The runLocally() of the operator failed.
APT_CombinedOperatorController(1),0: Operator terminated abnormally: runLocally() did not return APT_StatusOk
The log file says
Path used: Direct - with parallel option.
Table NIP:
23023 Rows successfully loaded.
0 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Date conversion cache disabled due to overflow (default size: 1000)
Bind array size not used in direct path.
Column array rows : 5000
Stream buffer bytes: 256000
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 23023
Total logical records rejected: 0
Total logical records discarded: 0
Total stream buffers loaded by SQL*Loader main thread: 2464
Total stream buffers loaded by SQL*Loader load thread: 0
The table has two DATE fields and two TIMESTAMP fields, which are converted from varchar to date/timestamp. I tried changing the data, yet the same number of records is loaded before the same error is thrown. Can someone help with this error, please?
-
- Premium Member
- Posts: 20
- Joined: Mon Dec 19, 2005 10:00 pm
- Location: UK
I tried setting Disable Combination to True and found that the problem is with the Surrogate Key stage, but the log in Director is the same.
The DBA is not of much help; they say DATE_CACHE has to be increased when sqlldr is used. I guess DataStage is using the default.
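For what it's worth, the log's "Direct - with parallel option" line suggests the stage is generating OPTIONS(DIRECT=TRUE, PARALLEL=TRUE) for sqlldr, which an Oracle Enterprise stage normally lets you override via the APT_ORACLE_LOAD_OPTIONS environment variable. If the date cache really did need enlarging, something like the following OPTIONS clause should do it (the 5000 figure is only a guess; check what your Oracle release supports):

```
-- e.g. set APT_ORACLE_LOAD_OPTIONS to:
OPTIONS(DIRECT=TRUE, PARALLEL=TRUE, DATE_CACHE=5000)
```

Note, though, that the "date conversion cache disabled due to overflow" message is informational; SQL*Loader keeps converting dates without the cache, so it should not abort the load by itself.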
In the Surrogate Key stage I'm using a sequence created by the DBA, not by DataStage. Any idea how I can create the sequence using the stage, as suggested in the Parallel Job Developer's Guide?
One more thing: the sequence ends with the value 23219 in the table, whereas the next value in the sequence is 23221.
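A gap like that is normal for Oracle sequences: numbers handed out to a session that later fails are never reissued, and with CACHE set, unused cached values can also be lost. A minimal sketch, assuming a sequence named NIP_SEQ (the name and cache size here are made up):

```sql
-- Oracle sequences do not guarantee gap-free numbering:
-- a NEXTVAL fetched by an aborting process is simply lost.
CREATE SEQUENCE nip_seq START WITH 1 INCREMENT BY 1 CACHE 20;
SELECT nip_seq.NEXTVAL FROM dual;
```

So a missing value such as 23220 is consistent with one row consuming a sequence number and then failing before it was written.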
Any thoughts on this please?
-
- Premium Member
- Posts: 20
- Joined: Mon Dec 19, 2005 10:00 pm
- Location: UK
Craig, running the job without the Surrogate Key stage is successful: all 20 million records load into the table. So it must be the Surrogate Key stage that is throwing this strange error:
Surrogate_Key_Generator_26,0: Caught unknown exception from runLocally().
Surrogate_Key_Generator_26,0: The runLocally() of the operator failed.
Surrogate_Key_Generator_26,0: Input 0 consumed 23220 records.
Surrogate_Key_Generator_26,0: Output 0 produced 23219 records.
Surrogate_Key_Generator_26,0: Operator terminated abnormally: runLocally() did not return APT_StatusOk
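Given that input 0 consumed 23220 records but output 0 produced only 23219, record 23220 looks like the one the stage chokes on. One thing worth checking is whether the DBA-created sequence itself imposes a limit; a query along these lines would show it (sequence name assumed):

```sql
-- Inspect MAXVALUE, cache size and high-water mark of the sequence:
SELECT min_value, max_value, increment_by, cache_size, last_number
FROM   user_sequences
WHERE  sequence_name = 'NIP_SEQ';
```

If MAXVALUE (or a NOCYCLE limit) sits near 23220, that would explain an abort at exactly this record.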
-
- Premium Member
- Posts: 20
- Joined: Mon Dec 19, 2005 10:00 pm
- Location: UK