Date conversion cache disabled due to overflow (default size: 1000)

Posted: Fri Aug 31, 2007 6:17 am
by Shadab_Farooque
Hi,

I am loading 6 million records from a sequential file to Oracle.
My job aborts at around 3 million records.
The error message is "Date conversion cache disabled due to overflow (default size: 1000)".

Please let me know how to handle this.

Regards

Posted: Fri Aug 31, 2007 6:22 am
by Shadab_Farooque
Giving more details below

I have one date column in my job. The input date value in the txt file looks like 2007-12-31, and I am using the StringToDate function to convert it to a date. The input txt file has 6 million records.
The job aborts at around 3 million records with the error:
"Date conversion cache disabled due to overflow (default size: 1000)".

Please help.

Re: Date conversion cache disabled due to overflow (default size: 1000)

Posted: Fri Aug 31, 2007 6:39 am
by JoshGeorge
Shadab_Farooque wrote: The error message is "Date conversion cache disabled due to overflow (default size: 1000)".
You are using sqlldr for loading, and this error is thrown by sqlldr.

Re: Date conversion cache disabled due to overflow (default size: 1000)

Posted: Fri Aug 31, 2007 6:46 am
by Shadab_Farooque
I am using the Oracle Enterprise stage for loading the data.
It uses sqlldr in the background.

How do I increase the date cache and the errors-allowed parameters in DataStage 7.5.1 (PX)?

Posted: Fri Aug 31, 2007 6:50 am
by JoshGeorge
The date cache is set to 1000 entries by default. You may want to try either increasing the size of the date cache or disabling it entirely by specifying DATE_CACHE=0. Consider adding or changing the options in $APT_ORACLE_LOAD_OPTIONS to match your requirement.

Date cache:
Max Size: 1000

Max Size is the maximum number of entries in the cache. By default the date cache holds 1,000 elements; this can be changed with the SQL*Loader DATE_CACHE command-line parameter.
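As a sketch, the environment variable could be set like this (values are illustrative; DIRECT=TRUE and PARALLEL=TRUE are the usual defaults the Oracle Enterprise stage passes to sqlldr, and DATE_CACHE is the SQL*Loader parameter discussed above):

```shell
# Set via Administrator, a job parameter, or the shell that launches the job.
# DATE_CACHE=0 disables the date cache entirely; a larger value such as
# DATE_CACHE=5000 enlarges it instead of disabling it.
APT_ORACLE_LOAD_OPTIONS='OPTIONS(DIRECT=TRUE, PARALLEL=TRUE, DATE_CACHE=0)'
export APT_ORACLE_LOAD_OPTIONS
```

Whether you disable or enlarge the cache depends on how many distinct date values the data actually contains; with only a few distinct dates, enlarging the cache keeps the performance benefit.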

Posted: Fri Aug 31, 2007 7:14 am
by Shadab_Farooque
Hi George,

Thanks for your input.

Please let me know how to set DATE_CACHE=0 via $APT_ORACLE_LOAD_OPTIONS.

Posted: Fri Aug 31, 2007 10:24 pm
by bkumar103
Why don't you use sqlldr directly to load the data after formatting the input? I didn't hit any problems even when loading 7 million records.
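A minimal standalone sqlldr run might look like the following (the control file contents, table, column names, and connect string are all placeholders for illustration; DATE_CACHE and ERRORS are the real SQL*Loader parameters relevant to this thread):

```shell
# load.ctl -- hypothetical control file; adjust columns and delimiter to your data:
#   LOAD DATA
#     INFILE 'input.txt'
#     APPEND INTO TABLE target_table
#     FIELDS TERMINATED BY ','
#     (id, load_date DATE "YYYY-MM-DD")

# date_cache=0 disables the date conversion cache;
# errors=100000 raises the abort threshold from the default of 50.
sqlldr userid=scott/tiger@orcl control=load.ctl log=load.log \
       direct=true date_cache=0 errors=100000
```

Check load.log afterward: it reports the date cache statistics and any rejected rows, which helps confirm whether the cache setting took effect.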

Posted: Mon Oct 20, 2008 12:20 am
by siauchun84
bkumar103, how do you use sqlldr directly to load the data? I have 9 million records to load, but the job aborts at the 6 or 7 million mark. Please help.