ORA-01555: snapshot too old

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL

Post by kcbland »

Dsnew wrote: I will implement this.
Do I have to check 'Allow multiple instance'?
How do I give a name to an instance?
If I concatenate the sequential files after using the MOD function, won't that disturb the original sequence? Or does that not matter?
I have never done this before, so can you please explain in detail?
1. Yes
2. Using a Sequencer or job control - just supply the instance name as Jobname.Instancename
3. What original sequence would that be? If you divide a source dataset into 5 round-robin assigned sets, there's no ordering. I suggest you reorder the data if necessary. If you have to process ordered data, it behooves you to partition in a round-robin sense that preserves ordering within partitioned groups. In other words, separate your source data into sets that keep "GROUPS" together, but still give a near-even distribution of data. You want to open more read-write processes between Oracle and DS to maximize data spooling; that's the goal.
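As a rough sketch of the MOD split on the source side (the table and column names here are invented, and a numeric key is assumed), each of the 5 instances would run the same source SQL with a different remainder:

    SELECT *
    FROM   orders                   -- hypothetical source table
    WHERE  MOD(order_id, 5) = 0     -- instance 1 keeps remainder 0, instance 2 keeps 1, ..., instance 5 keeps 4

If ordering within groups matters, apply MOD to the grouping key rather than a row-level key, so that each group lands entirely within a single instance.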
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
Dsnew
Participant
Posts: 90
Joined: Tue Apr 19, 2005 2:08 pm

Post by Dsnew »

Kenneth
Thanks for your advice and patience!
I will go ahead and implement this method.
Will get back to you if I have any issues.
Sainath.Srinivasan
Participant
Posts: 3337
Joined: Mon Jan 17, 2005 4:49 am
Location: United Kingdom

Post by Sainath.Srinivasan »

Are all 40 million rows included in each load? Maybe you can consider breaking them into smaller delta loads.
narasimha
Charter Member
Posts: 1236
Joined: Fri Oct 22, 2004 8:59 am
Location: Staten Island, NY

Post by narasimha »

Sainath,

Break it up using the MOD in the source as Ken suggested, or do you have any other ideas?
The 40 million rows are a one-time load.
Narasimha Kade

Finding answers is simple, all you need to do is come up with the correct questions.
katz
Charter Member
Posts: 52
Joined: Thu Jan 20, 2005 8:13 am

Post by katz »

I agree with the previous suggestions to use the Bulk Loader in your scenario, and have only the following observation regarding the "ORA-01555: snapshot too old" error.

It sounds like you are experiencing what is known as a "fetch across commits within an open cursor", and in this situation... "Committing less often (which will result in larger rollback segments) will REDUCE the probability of getting the ORA-01555. Thus, there is a trade-off between disk space and success in cursor processing."
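For illustration only (the table and column names are invented), the classic shape of this problem is a loop that commits while its driving cursor is still open:

    -- Sketch of the "fetch across commits" pattern; all names are hypothetical.
    DECLARE
      v_rows PLS_INTEGER := 0;
    BEGIN
      FOR rec IN (SELECT id, payload FROM source_table) LOOP   -- cursor stays open for the whole run
        INSERT INTO target_table (id, payload) VALUES (rec.id, rec.payload);
        v_rows := v_rows + 1;
        IF MOD(v_rows, 1000) = 0 THEN
          COMMIT;    -- each commit lets the undo that the still-open cursor
        END IF;      -- needs for read consistency be overwritten; a later
      END LOOP;      -- fetch can then raise ORA-01555
      COMMIT;
    END;
    /

The less often that COMMIT fires (or the larger the rollback segments), the lower the chance of the error, which is exactly the trade-off quoted above.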

Just out of curiosity, what was the setting for "Rows per Transaction" on the Oracle stage doing the 40 million inserts?
narasimha
Charter Member
Posts: 1236
Joined: Fri Oct 22, 2004 8:59 am
Location: Staten Island, NY

Post by narasimha »

You want to know the rows/sec for loading the 40 million records?
It was 5215 rows/sec and it took around two and a half hours.
Narasimha Kade

Finding answers is simple, all you need to do is come up with the correct questions.
katz
Charter Member
Posts: 52
Joined: Thu Jan 20, 2005 8:13 am

Post by katz »

Hi narasimha,

I am referring to the screen input item of the Oracle OCI stage that is performing the insert operation. Look for an input box labelled "Rows per transaction" on the Transaction Handling tab of the Input page. This value determines the number of rows inserted before committing in Oracle.

The idea is that you can reduce the possibility of getting the ORA-01555 error by committing less often (i.e. inserting more rows before committing). In other words, place a large value in Rows per transaction.

You must balance the disk space required to hold the additional uncommitted inserts against reducing the chance of hitting the ORA-01555 error.
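As a rough illustration of the trade-off: committing every 1,000 rows across 40 million inserts means about 40,000 commits, whereas committing every 1,000,000 rows means only about 40, at the price of holding far more uncommitted work in the rollback segments at any one time.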

I was curious whether you had a low number (e.g. 1) in the Rows per transaction when you had the error.

This is only academic, as you are best advised to use the Bulk Loader method for this volume of input.

Best of luck
narasimha
Charter Member
Posts: 1236
Joined: Fri Oct 22, 2004 8:59 am
Location: Staten Island, NY

Post by narasimha »

I am using the Bulk Load method and getting good results :lol:
Yeah, in my previous OCI method my Rows per transaction was '0'; I had not changed that!! :oops:
Narasimha Kade

Finding answers is simple, all you need to do is come up with the correct questions.