Date conversion cache disabled due to overflow (default size

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

Shadab_Farooque
Participant
Posts: 21
Joined: Tue Apr 24, 2007 12:39 am

Date conversion cache disabled due to overflow (default size

Post by Shadab_Farooque »

Hi,

I am loading 6 million records from a sequential file to Oracle.
My job aborts at around 3 million records.
The error message is "Date conversion cache disabled due to overflow (default size: 1000)".

Please let me know how to handle this.

Regards
Shadab_Farooque
Participant
Posts: 21
Joined: Tue Apr 24, 2007 12:39 am

Post by Shadab_Farooque »

More details:

I have one date column in my job. The input date value in the txt file looks like 2007-12-31, and I am using the StringToDate function to convert it into date format. The input txt file has 6 million records.
The job aborts at 3 million, throwing the error
"Date conversion cache disabled due to overflow (default size: 1000)".

Please help.
JoshGeorge
Participant
Posts: 612
Joined: Thu May 03, 2007 4:59 am
Location: Melbourne

Re: Date conversion cache disabled due to overflow (default

Post by JoshGeorge »

You are using sqlldr (SQL*Loader) for the load, and this is an error thrown by sqlldr.
Shadab_Farooque wrote:The error message is "Date conversion cache disabled due to overflow (default size: 1000)".
Joshy George
<a href="http://www.linkedin.com/in/joshygeorge1" ><img src="http://www.linkedin.com/img/webpromo/bt ... _80x15.gif" width="80" height="15" border="0"></a>
Shadab_Farooque
Participant
Posts: 21
Joined: Tue Apr 24, 2007 12:39 am

Re: Date conversion cache disabled due to overflow (default

Post by Shadab_Farooque »

I am using the Oracle Enterprise stage for loading the data.
It runs SQL*Loader (sqlldr) in the background.

How do I increase the date cache and the errors-allowed parameters in DataStage 7.5.1 (PX)?
Shadab Farooque
JoshGeorge
Participant
Posts: 612
Joined: Thu May 03, 2007 4:59 am
Location: Melbourne

Post by JoshGeorge »

The date cache is set to 1,000 entries by default. You may want to try either increasing the size of the date cache or disabling it altogether by specifying DATE_CACHE=0. Add or change the options in $APT_ORACLE_LOAD_OPTIONS to match your requirement.

Date cache:
Max Size: 1000

Max Size is the maximum number of entries in the cache. By default, the date cache size is 1,000 elements, but it can be changed with the SQL*Loader DATE_CACHE command-line parameter.
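A minimal sketch of how this could be set in the job environment. The variable name is $APT_ORACLE_LOAD_OPTIONS (with an underscore); the DIRECT and PARALLEL values shown are illustrative and should match whatever your job currently uses:

```shell
# Pass DATE_CACHE=0 to SQL*Loader through the DataStage environment
# variable. DIRECT/PARALLEL values here are illustrative assumptions.
export APT_ORACLE_LOAD_OPTIONS='OPTIONS(DIRECT=TRUE, PARALLEL=FALSE, DATE_CACHE=0)'
echo "$APT_ORACLE_LOAD_OPTIONS"
```

In DataStage itself this would typically be added as a job or project environment variable rather than exported in a shell.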
Joshy George
<a href="http://www.linkedin.com/in/joshygeorge1" ><img src="http://www.linkedin.com/img/webpromo/bt ... _80x15.gif" width="80" height="15" border="0"></a>
Shadab_Farooque
Participant
Posts: 21
Joined: Tue Apr 24, 2007 12:39 am

Post by Shadab_Farooque »

Hi George,

Thanks for your input.

Please let me know how to set DATE_CACHE=0 in $APT_ORACLE_LOAD_OPTIONS.
Shadab Farooque
bkumar103
Participant
Posts: 214
Joined: Wed Jul 25, 2007 2:29 am
Location: Chennai

Post by bkumar103 »

Why don't you use sqlldr directly to load the data, after formatting the input data? I didn't hit any problem even when loading 7 million records.
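For what a direct sqlldr load might look like: the sketch below writes a control file and prints the corresponding command line. The table, columns, and connect string are made up for illustration; the DATE mask matches the 2007-12-31 format from this thread, and date_cache=0 / errors are standard sqlldr command-line parameters:

```shell
# Illustrative control file -- table and column names are assumptions.
cat > load_dates.ctl <<'EOF'
LOAD DATA
INFILE 'input.txt'
APPEND
INTO TABLE target_table
FIELDS TERMINATED BY ','
( id,
  load_date DATE "YYYY-MM-DD"
)
EOF

# date_cache=0 disables the date conversion cache; errors raises the
# number of rejected rows allowed before sqlldr aborts the load.
echo "sqlldr userid=scott/tiger@orcl control=load_dates.ctl direct=true date_cache=0 errors=100000"
```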
siauchun84
Participant
Posts: 63
Joined: Mon Oct 20, 2008 12:01 am
Location: Malaysia

Post by siauchun84 »

bkumar103, how do I use sqlldr directly to load the data? I have 9 million records to load, but the job aborts at around the 6 or 7 million mark. Please help.