Hi forum!!
My job runs on an AIX server with DataStage EE v7.5.2, and I am trying to insert records into Oracle using the Oracle Enterprise stage. In the Sequential File stage I set the property Read First Rows = 150000, and the job runs and inserts without problems.
When I change the value to 350000 the job aborts about 30 minutes later, and the error shown is the following:
"Oracle_Enterprise_216,0: The call to sqlldr failed; the return code = 589,833; please see the loader logfile: /stag/parallel/node1/Scratch/ora.618582.967574.0.log for details."
That log file is empty. I searched the forum for this problem and applied the suggested changes to the environment variables (the Oracle variables were commented out in the dsenv file), but the error still appears when the value is increased ...
Does anybody have an idea about this problem?
Thanks for your help.
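(For reference, the Oracle-related entries in dsenv on AIX usually look something like the fragment below. The ORACLE_HOME path here is only an illustration, not taken from this post; substitute your own installation's path.)

```shell
# Example dsenv fragment for Oracle on AIX.
# The path below is illustrative only -- use your own ORACLE_HOME.
ORACLE_HOME=/u01/app/oracle/product/9.2.0
export ORACLE_HOME

PATH=$PATH:$ORACLE_HOME/bin
export PATH

# On AIX the shared-library search path is LIBPATH
# (not LD_LIBRARY_PATH as on Linux/Solaris).
LIBPATH=$LIBPATH:$ORACLE_HOME/lib
export LIBPATH
```

After editing dsenv you normally have to restart the DataStage engine for the changes to take effect.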
Sqlldr error ...
Regards,
Miguel Seclén
Lima - Peru
jseclen wrote: Hi DS_FocusGroup,
The sqlldr binary was found, because the previous load inserted 150,000 records; it is only when this number is increased that the process aborts ...
Have you checked whether your log file is present or not?
Check your configuration file and take the scratch space path from there. Can you find the log file there?
Check whether an Oracle table constraint has been violated, because just increasing the number of records should not, by itself, cause a SQL*Loader error.
And of course, check the tablespace for free space with a query like:
Code: Select all
select sum(bytes)/1024/1024/1024 as free_gb, tablespace_name
from dba_free_space
where tablespace_name = '<tablespace>'
group by tablespace_name;
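To find the scratch paths mentioned above, you can pull the scratchdisk entries straight out of the APT configuration file. A small sketch (the helper name and the sample config are made up for illustration; point it at your own $APT_CONFIG_FILE):

```shell
# Illustrative helper: list the scratchdisk paths named in an APT
# configuration file, so you know where to look for the ora.*.log
# files left behind by the Oracle Enterprise stage.
scratch_dirs() {
  sed -n 's/.*scratchdisk "\([^"]*\)".*/\1/p' "$1"
}

# Demonstration against a throwaway sample config (contents made up):
cat > /tmp/sample.apt <<'EOF'
{
  node "node1" {
    fastname "aixhost"
    pools ""
    resource disk "/stag/parallel/node1/Disk" {pools ""}
    resource scratchdisk "/stag/parallel/node1/Scratch" {pools ""}
  }
}
EOF
scratch_dirs /tmp/sample.apt   # -> /stag/parallel/node1/Scratch
```

You can then run something like `find <dir> -name 'ora.*.log'` in each returned directory to locate the loader log.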
Priyadarshi Kunal
Genius may have its limitations, but stupidity is not thus handicapped.
Hi,
I hit a similar problem, but I am using the newer version 8.0.1 running on Windows 2003 Server. My sqlldr return code is 7; the input is 9,999,999 records and it failed at about 1.8 million records (runs with 800,000 and 1.5 million records were OK).
I have tried running sqlldr directly, and all the data (9,999,999 records) could be loaded into the same target table. Doesn't that mean it is not a space issue on Oracle?
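For what it's worth, Oracle's SQL*Loader documentation lists only four exit codes on Unix (0 = EX_SUCC, 1 = EX_FAIL, 2 = EX_WARN, 3 = EX_FTL; the values can differ on Windows), so a return code of 7, or the 589,833 from the first post, is not one of sqlldr's own codes and likely reflects a raw or composite process status reported by the caller. A small sketch mapping the documented Unix values (the function name is made up for illustration):

```shell
# Illustrative helper: translate a sqlldr exit status into its
# documented meaning.  Values per Oracle's SQL*Loader docs for Unix:
#   0 = EX_SUCC, 1 = EX_FAIL, 2 = EX_WARN, 3 = EX_FTL.
sqlldr_status() {
  case "$1" in
    0) echo "EX_SUCC: load completed, all rows loaded" ;;
    1) echo "EX_FAIL: command-line or syntax error, or OS-level failure" ;;
    2) echo "EX_WARN: some or all rows rejected or discarded" ;;
    3) echo "EX_FTL: fatal error (e.g. bad control file, out of space)" ;;
    *) echo "not a documented sqlldr code; likely a raw process status" ;;
  esac
}

sqlldr_status 2   # -> EX_WARN: some or all rows rejected or discarded
```

With code 2 (EX_WARN), the rejected rows and the reason for each rejection should appear in the loader log and the .bad file, so getting DataStage to preserve that log is the key diagnostic step.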