Bulk Load to Oracle via Oracle connector

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

Post Reply
zaino22
Premium Member
Posts: 81
Joined: Thu Mar 22, 2007 7:10 pm

Bulk Load to Oracle via Oracle connector

Post by zaino22 »

DS 8.1.0 running on UNIX.

Hi forum,
I have a datastage Job that is currently using insert as write mode to load data to Oracle Table without an issue. I want to take advantge of Bulk Load option in Oracle Connector (OC), and process the Load faster but unfortunately I am getting error
TABLE_NAME: The OCI function executeDirect returned status -1. Error code: 942, Error message: ORA-00942: table or view does not exist. (CC_OraLoad::executeStatement, file CC_OraLoad.cpp, line 3,499)


The table does exist, but the exception table did not, and I thought the error was because of that (since I reference the exception table in the OC). However, even after the exception table was created (as DBNAME.TABLENAME_EXPTN, where the target table is DBNAME.TABLENAME), I am still getting the same error.

I have gone through the manual and other forums and cannot find the reason for the error. I did find a post on DSXchange asking about this very error, but unfortunately the poster never got a response, nor did he share how he resolved the issue.

For the benefit of others, I am explaining it in detail, and I hope someone will respond.

I can see data reach the Oracle Connector stage without issue, but then it errors out. Any help will be appreciated.
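As a sanity check for ORA-00942, this is the kind of query that can be run as the account the connector logs in with (a sketch; the schema/table names are the ones from my job):

```sql
-- Run while connected as the same user the Oracle Connector uses.
-- ORA-00942 usually means the object is either not visible to this
-- account or no privileges have been granted on it.
SELECT owner, table_name
  FROM all_tables
 WHERE owner = 'DBNAME'
   AND table_name IN ('TABLENAME', 'TABLENAME_EXPTN');
```

If either table is missing from the result, the connecting user cannot see it, regardless of whether it exists in the schema.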

Below are the other errors that follow the first error (above):
TABLE_NAME,0: Null argument: lc (CC_ConnectorFactory::createConnector[2], file CC_ConnectorFactory.cpp, line 883)
TABLE_NAME,0: Method not supported (CC_Connection::setEnvironment, file CC_Connection.cpp, line 289)
TABLE_NAME,0: The runLocally() of the operator failed.

Here is how my OC settings look:

Properties
===========

Variant=10

Connection section:
These values do not change when switching between Insert and Bulk Load modes.

Usage section:
Write mode= Bulk Load
Generate SQL at runtime=No (deselected option)
Table name=DBNAME.TABLENAME
Table scope=Entire table

Enable quoted identifiers=No
Table action=Append

Session:
Array size=2000
Drop unmatched fields=Yes
Process warning messages as fatal=No

Run before and after SQL=Yes
Read Before SQL statement from file=No
Fail on error for Before SQL statement=Yes

Read After SQL statement from file=No
Fail on error for After SQL statement=Yes


Bulk Load options:

Allow concurrent load sessions=Yes
Buffer size=1024
Use Oracle date cache=Yes
Cache size=1000
Disable cache when full=Yes

Index maintenance option=Skip All
Disable logging=Yes

Perform operations before bulk load=Yes
Disable constraints=Yes
Disable triggers=Yes

Perform operations after bulk load=Yes
Rebuild indexes=Yes
Parallel clause=PARALLEL
Degree of parallelism=8
Logging clause=No logging
Fail on error for index rebuilding=No

Enable constraints=Yes
Exception table name=DBNAME.TABLENAME_EXPTN
Process exception rows=No

Enable triggers=Yes
Manage application failover=Yes
Number of retries=10
Time between retries=10
Replay Before SQL statement=No
Replay Before SQL (Node) statement=No
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Do both tables have the proper grants for the connecting user?
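If they don't, something along these lines would be needed (a sketch; DS_USER is a placeholder for whatever account the connector actually uses):

```sql
-- Hypothetical grants for the connecting account; a direct-path
-- bulk load may require more than plain INSERT, depending on setup.
GRANT SELECT, INSERT ON DBNAME.TABLENAME TO DS_USER;
GRANT SELECT, INSERT ON DBNAME.TABLENAME_EXPTN TO DS_USER;
```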
-craig

"You can never have too many knives" -- Logan Nine Fingers
Kryt0n
Participant
Posts: 584
Joined: Wed Jun 22, 2005 7:28 pm

Post by Kryt0n »

Just to be sure, by DBNAME you do mean the schema name, don't you?

Not sure if it applies to bulk load, but "Run before and after SQL=Yes" may want to be "No".
zaino22
Premium Member
Posts: 81
Joined: Thu Mar 22, 2007 7:10 pm

Post by zaino22 »

@chulett: Good suggestion. Since the table structures are the same for DBNAME.TABLENAME and DBNAME.TABLENAME_EXPTN, I tried loading DBNAME.TABLENAME_EXPTN instead, and it was successful. That leads me to believe the access is the same for both tables.

@Kryt0n: yes, I referred to the schema as DBNAME. Sorry for the confusion.
By the way, I changed the value to "Run before and after SQL=No" but still get the same error.

Issue still outstanding...
zaino22
Premium Member
Posts: 81
Joined: Thu Mar 22, 2007 7:10 pm

Post by zaino22 »

So, it's pretty much a done deal: this cannot be done on DataStage versions prior to 8.5 Fix Pack 1.
Sorry for the lengthy post; I just wanted to be helpful to those who will be searching in the future.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Thanks for posting all that but let's be specific about what cannot be done.

Obviously, people have been bulk loading into Oracle for years using DataStage. The issue is specific to the new Oracle Connector stage trying to bulk load to a table not owned by the connection account. For that specific case you'd need to be on at least 8.5 with FixPack 1 installed.

Just so we don't mislead any speed-reading future searchers. :wink:
-craig

"You can never have too many knives" -- Logan Nine Fingers
Post Reply