DS 8.1.0 running on UNIX.
Hi forum,
I have a DataStage job that currently uses Insert as the write mode to load data into an Oracle table without an issue. I want to take advantage of the Bulk Load option in the Oracle Connector (OC) to speed up the load, but unfortunately I am getting this error:
TABLE_NAME: The OCI function executeDirect returned status -1. Error code: 942, Error message: ORA-00942: table or view does not exist. (CC_OraLoad::executeStatement, file CC_OraLoad.cpp, line 3,499)
The table does exist, but the exception table did not, and I thought the error was because of that (since I am referencing an exception table in the OC). However, even after the exception table was created (as DBNAME.TABLENAME_EXPTN, where the target table is DBNAME.TABLENAME), I am still getting the same error.
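For anyone following along, the exception table was created to match the target table's layout. A minimal way to do that (assuming, as in my case, the bulk-load exception table should have the same columns as the target; the names below are just the ones used in this post) is:

```sql
-- Create an empty exception table with the same columns as the target table.
-- WHERE 1 = 0 copies the structure but no rows.
CREATE TABLE DBNAME.TABLENAME_EXPTN AS
  SELECT * FROM DBNAME.TABLENAME WHERE 1 = 0;
```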
I have gone through the manual and other forums and cannot find the reason for the error. I did find a post on DSX asking about this very same error, but unfortunately the poster never got a response, nor did he share how he resolved the issue.
For the benefit of others, I am explaining it in detail, and hope someone will respond.
I can see that data reaches the Oracle Connector stage without issue, but then it errors out. Any help will be appreciated.
Below are the other errors that follow the first error (above):
TABLE_NAME,0: Null argument: lc (CC_ConnectorFactory::createConnector[2], file CC_ConnectorFactory.cpp, line 883)
TABLE_NAME,0: Method not supported (CC_Connection::setEnvironment, file CC_Connection.cpp, line 289)
TABLE_NAME,0: The runLocally() of the operator failed.
Here is how my OC settings look:
Properties
===========
Variant= 10
Connection section:
These values are unchanged when switching between the Insert and Bulk Load modes.
Usage section:
Write mode= Bulk Load
Generate SQL at runtime=No (deselected option)
Table name=DBNAME.TABLENAME
Table scope=Entire table
Enable quoted identifiers=No
Table action=Append
Session:
Array size=2000
Drop unmatched fields=Yes
Process warning messages as fatal=No
Run before and after SQL=Yes
Read before sql statement from file=No
Fail on Error for Before SQL statement=Yes
Read After sql statement from file=No
Fail on Error for After SQL statement=Yes
Bulk Load options:
Allow concurrent load sessions: Yes
Buffer size= 1024
Use Oracle date cache=Yes
Cache size=1000
Disable cache when full=Yes
Index maintenance option=Skip All
Disable logging=Yes
Perform operation before Bulk load=Yes
Disable constraints=Yes
Disable triggers=Yes
Perform operation after Bulk load=Yes
Rebuild indexes=Yes
Parallel clause=PARALLEL
Degree of parallelism=8
Logging clause=NO LOGGING
Fail on error for index rebuilding=No
Enable constraints=Yes
Exception Table name=DBNAME.TABLENAME_EXPTN
Process exception rows=No
Enable triggers=Yes
Manage application failover=Yes
Number of retries=10
Time between retries=10
Replay Before SQL statement= No
Replay Before SQL (Node) statement=No
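For reference, my understanding is that the post-load index options above (Rebuild indexes=Yes, Parallel clause=PARALLEL, Degree of parallelism=8, Logging clause=NO LOGGING) correspond to the connector issuing something along these lines after the load. This is only an illustration of what those option values mean, not the connector's exact SQL, and IDX_NAME is a placeholder:

```sql
-- Hypothetical post-load rebuild reflecting the settings above:
-- PARALLEL 8 from the parallel clause/degree, NOLOGGING from the logging clause.
ALTER INDEX DBNAME.IDX_NAME REBUILD PARALLEL 8 NOLOGGING;
```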
Bulk Load to Oracle via Oracle connector
@chulett: Good suggestion. Since the table structures are the same for DBNAME.TABLENAME and DBNAME.TABLENAME_EXPTN, I tried loading DBNAME.TABLENAME_EXPTN instead and it was successful. That leads me to believe the access rights are the same for both tables.
@Kryt0n: yes, I referred to the schema as DBNAME. Sorry for the confusion.
By the way, I changed the value to "Run before and after SQL=No" but still get the same error.
Issue still outstanding...
Thanks for posting all that but let's be specific about what cannot be done.
Obviously, people have been bulk loading into Oracle for years using DataStage. The issue is specific to the new Oracle Connector stage trying to bulk load to a table not owned by the connection account. For that specific case you'd need to be on at least 8.5 with FixPack 1 installed.
Just so we don't mislead any speed-reading future searchers.
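To check whether you're hitting that case, compare the connection account against the table's owner from within the same session. Note that ALL_TABLES only lists tables the account can access, so getting no rows back would also be consistent with the ORA-00942:

```sql
-- Which account is this session connected as?
SELECT USER FROM dual;

-- Who owns the target table, as visible to this account?
SELECT owner, table_name
  FROM all_tables
 WHERE table_name = 'TABLENAME';
```

If the owner returned differs from the connection account, you're in the "table not owned by the connection account" situation described above.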
-craig
"You can never have too many knives" -- Logan Nine Fingers
"You can never have too many knives" -- Logan Nine Fingers