Execute Command stage calling SQLLDR

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

bigpoppa
Participant
Posts: 190
Joined: Fri Feb 28, 2003 11:39 am

Execute Command stage calling SQLLDR

Post by bigpoppa »

I'm trying to use a job sequence, specifically the Execute Command stage, to call a script that contains a sqlldr command. When I run the script from the UNIX command line, it works fine. When I run it from the Execute Command stage, I get the error below. I thought there might be a discrepancy in my environment, but there's no difference between my UNIX env and the ENV that shows up in the Director log for the job. BTW, I'm using Oracle 9i.

Here's the error:
---
SQL*Loader-128: unable to begin a session
ORA-01034: ORACLE not available
ORA-27101: shared memory realm does not exist
SVR4 Error: 2: No such file or directory
--
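For reference, the script is essentially a thin wrapper around sqlldr, something like this (the paths, SID, and credentials here are made up):

```shell
#!/bin/sh
# Illustrative wrapper script - all paths, the SID, and the login are hypothetical.
ORACLE_HOME=/u01/app/oracle/product/9.2.0
ORACLE_SID=ORCL
PATH=$PATH:$ORACLE_HOME/bin
export ORACLE_HOME ORACLE_SID PATH

# Fail gracefully on machines where the Oracle client isn't installed
if command -v sqlldr >/dev/null 2>&1; then
    sqlldr userid=scott/tiger \
           control=/data/load/customers.ctl \
           log=/data/load/customers.log
else
    echo "sqlldr not found on PATH - check ORACLE_HOME/bin" >&2
fi
```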


Also, I found this diagnostic on the web, but it doesn't mean much to me. Can anyone interpret it for me?

------
2) For remote (TCP/listener) connections:

Review the listener.ora currently used to startup the listener
and verify the ORACLE_HOME is correct for all listed 8i
databases.

If the ORACLE_HOME points to the 8.1.7 software yet the database
was created with 8.1.6 or 8.1.5, then this error can occur.


Explanation
-----------

Previously, when ORACLE_HOME or ORACLE_SID was set incorrectly
in the 'oracle' or client user's environment, only ORA-1034 was reported.

ORA-01034 "ORACLE not available"

With 8.1.7, the ora-27101 is reporting that the shared memory
key generated by the client doesn't match any currently
existing keys. This is to be expected if the ORACLE_HOME used by
the client isn't the same as the one used to startup the database
with or the ORACLE_SID is not correctly referencing the right
instance.
------

Thanks,
BP
kduke
Charter Member
Posts: 5227
Joined: Thu May 29, 2003 9:47 am
Location: Dallas, TX
Contact:

Post by kduke »

Poppa

A DataStage job runs without loading .profile or /etc/profile. You need to put ORACLE_HOME and all the other shell variables in dsenv (in the DSEngine directory), then stop and start DataStage. Do a search; this has been covered before.
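For example, the lines to add to $DSHOME/dsenv might look like this (the path and SID here are hypothetical; use the values from your working shell session):

```shell
# Hypothetical Oracle settings for $DSHOME/dsenv - mirror your shell environment
ORACLE_HOME=/u01/app/oracle/product/9.2.0
ORACLE_SID=ORCL
PATH=$PATH:$ORACLE_HOME/bin
LD_LIBRARY_PATH=${LD_LIBRARY_PATH:-}:$ORACLE_HOME/lib
export ORACLE_HOME ORACLE_SID PATH LD_LIBRARY_PATH
```

Then bounce the engine (e.g. `cd $DSHOME; ./bin/uv -admin -stop` followed by `./bin/uv -admin -start`) so new job runs pick up the values.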

Kim.
Mamu Kim
vmcburney
Participant
Posts: 3593
Joined: Thu Jan 23, 2003 5:25 pm
Location: Australia, Melbourne
Contact:

Post by vmcburney »

In version 7.0 the environment variables are displayed in the job log when a job is run:
"Environment variable settings:
APT_ORCHHOME=
DSHOME=/apps/Ascential/DataStage/DSEngine
HOME=/export/home/dsadm"

Poppa's problem is that the variables shown in his job log are the same as those he gets from the Unix prompt, so why doesn't his Oracle work? The only suggestion I can make is to put both sets of environment variables into text files and use a text-file comparison tool to make absolutely sure there are no differences. You may find a variable set in your dsenv file has a minor misspelling.
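That comparison might look like this (the file names are just examples, and here the job's environment is simulated with a second `env` capture; in practice you would paste the "Environment variable settings" block from the Director log into the second file):

```shell
# Capture the interactive shell environment, sorted for a stable comparison
env | sort > /tmp/shell_env.txt

# In practice, paste the Director log's variable list into /tmp/job_env.txt;
# here we simulate it with a second capture of the same environment.
env | sort > /tmp/job_env.txt

# diff exits 0 (and prints nothing) when the two environments match
diff /tmp/shell_env.txt /tmp/job_env.txt && echo "environments match"
```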