Sqlldr error ...

Posted: Mon Feb 04, 2008 2:37 pm
by jseclen
Hi forum !!

My job runs on an AIX server with DataStage EE v7.5.2, and I am trying to insert records into Oracle using the Oracle Enterprise stage. In the Sequential File stage I set the attribute Read First Rows = 150000, and the job runs and inserts without problems.

When I change the value to 350000, the job aborts about 30 minutes later with the following error:

"Oracle_Enterprise_216,0: The call to sqlldr failed; the return code = 589,833; please see the loader logfile: /stag/parallel/node1/Scratch/ora.618582.967574.0.log for details."

This log file is empty. I searched the forum for this problem and applied the suggested changes to the environment variables (the Oracle variables were commented out in the dsenv file), but the error still appears when the value is increased ...

Does anybody have an idea about this problem?

Thanks for your help.

Posted: Tue Feb 05, 2008 8:07 am
by DS_FocusGroup
Make sure your Oracle paths are set up properly; it is not able to find SQL*Loader.
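A minimal sketch of that check, run as the DataStage user on the server. The helper name `check_oracle_env` is hypothetical; it only verifies that ORACLE_HOME is set and that its bin directory is on PATH, which is where sqlldr would normally resolve from (dsenv is the usual place to export these):

```shell
#!/bin/sh
# Hypothetical helper: sanity-check the Oracle client environment that
# dsenv should export for the DataStage engine.
check_oracle_env() {
    ohome="$1"   # value of ORACLE_HOME
    path="$2"    # value of PATH
    if [ -z "$ohome" ]; then
        echo "ORACLE_HOME not set"
        return 1
    fi
    # sqlldr normally lives in $ORACLE_HOME/bin
    case ":$path:" in
        *":$ohome/bin:"*) echo "sqlldr should resolve from $ohome/bin" ;;
        *) echo "$ohome/bin not on PATH"; return 1 ;;
    esac
}

check_oracle_env "$ORACLE_HOME" "$PATH"
```

If this reports a problem, fix the exports in dsenv and restart the engine so the job picks them up.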

Posted: Tue Feb 05, 2008 11:08 am
by jseclen
Hi DS_FocusGroup,

sqlldr was found, because the previous load inserted 150000 records; the process only aborts when this number is increased ...

Posted: Tue Feb 05, 2008 12:17 pm
by priyadarshikunal
jseclen wrote: Hi DS_FocusGroup,

sqlldr was found, because the previous load inserted 150000 records; the process only aborts when this number is increased ...

Have you checked whether your log file is present or not?

Check your configuration file and take the scratch space path from there.

Can you find the log file there?

Also check that no Oracle table constraint has been violated,

because just by increasing the number of records you should not get an SQL*Loader error.

And of course check the tablespace for free space,

by running a query like:


Code:

select tablespace_name,
       sum(bytes)/1024/1024/1024 as free_gb
from   dba_free_space
where  tablespace_name = '<tablespace>'
group by tablespace_name;
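An empty loader log can also mean sqlldr died before it could write anything, for example when the scratch filesystem is full. A minimal sketch of that check on the server; `check_scratch` is a hypothetical helper name, and the path is the one from the error message above:

```shell
#!/bin/sh
# Hypothetical helper: confirm the scratch directory from the loader
# error exists, then report free space on the filesystem holding it.
check_scratch() {
    dir="$1"
    if [ ! -d "$dir" ]; then
        echo "missing: $dir"
        return 1
    fi
    # df -k: free space in KB for the filesystem containing the directory
    df -k "$dir"
}

check_scratch /stag/parallel/node1/Scratch
```

If the filesystem is near 100% used, the larger load could fill it mid-run, which would match the failure appearing only above a certain record count.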

 

Posted: Wed Feb 13, 2008 6:13 pm
by seell
Hi,
I hit a similar problem; however, I am using the newer version 8.0.1 running on Windows 2003 Server. My sqlldr return code is 7. The input is 9,999,999 records, and it failed at about 1.8 million records (runs with 800,000 and 1.5 million records were OK).

I have tried using the normal sqlldr directly, and all the data (9,999,999 records) could be loaded into the same target table. Does this not mean that it is not a space issue on Oracle?

Posted: Wed Feb 13, 2008 7:46 pm
by chulett
Have you talked to your DBA about what might be going on?

Posted: Thu Feb 14, 2008 2:57 am
by seell
My DBA also checked the free space as mentioned in the earlier responses, but he thinks the temp tablespace is not the problem.