Hi All,
I've got a case where I dump a table from Oracle 8.1.7 using an Oracle 10 client, and BIGINT columns get corrupted. The corruption causes a key column value to appear in several rows of my output data (not entire duplicate rows; just the key BIGINT column gets messed up).
I've used sqlplus with both the 8 and 10 (lib32) Oracle clients to dump the data, but no duplicates for that key were found.
Also, if I specify a WHERE clause in my Oracle stage to fetch that specific key, it shows only 1 row.
I found this while investigating deadlocks occurring in a job that performs an upsert to a table (in case anyone is interested).
i.e.
col1 bigint
col2 bigint
col3 char
col1 + col2 are the PK
actual values
col1 col2 col3
111 222 abd
112 222 vdsv
Unloading to a sequential file using a DS parallel job gives
111 222 abd
111 222 vdsv
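To confirm the unloaded file really contains repeated keys (rather than a display artifact), a quick check over the dumped rows can be sketched like this (the function name and the sample rows are just illustrations of the data above):

```python
from collections import Counter

def find_duplicate_keys(rows, key_cols=(0, 1)):
    # Count each composite (col1, col2) key and keep only those seen more than once.
    counts = Counter(tuple(row[i] for i in key_cols) for row in rows)
    return {key: n for key, n in counts.items() if n > 1}

# Rows mirroring the corrupted unload above:
corrupted = [
    ["111", "222", "abd"],
    ["111", "222", "vdsv"],  # col1 was mangled from 112 to 111
]
print(find_duplicate_keys(corrupted))  # {('111', '222'): 2}
```

Running the same check over the sqlplus dump should return nothing if the corruption happens only in the Oracle stage path.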
Has anyone had this before?
Any help will be appreciated.
Thanks,
Data Corruption using Oracle stage
Roy R.
Time is money but when you don't have money time is all you can afford.
Search before posting:)
Join the DataStagers team effort at:
http://www.worldcommunitygrid.org
The fields are NUMBER(10) in the DB.
Even using Int as the data type, the same thing happens (the current values are smaller than 10 digits).
This gets weirder by the hour.
I use TO_CHAR on the fields and pass them to a Filter or Copy stage, then to a Sequential File stage.
When I limit the job to 1000 rows, I see 999 rows passed from the Oracle stage but only 100 reach the sequential file after the Filter/Copy stage,
and the filter condition is simply TRUE.
I'm trying to get to the bottom of things.
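To pin down where rows disappear, a simple line count of the files landed on either side of the Filter/Copy stage can help; a minimal sketch (the file paths are hypothetical):

```python
def count_rows(path):
    # Count non-empty data lines in a landed sequential file.
    with open(path) as f:
        return sum(1 for line in f if line.strip())

# Hypothetical landing files before and after the Filter/Copy stage:
# print(count_rows("oracle_extract.txt"))  # expect 999
# print(count_rows("after_filter.txt"))    # observed 100
```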
Roy R.
OK,
someone had set a force compile and trace, plus a 100-rows-per-node limit, phew.
The corruption is still an issue; that only solves the mystery of the output being limited to 100 rows!
Last edited by roy on Thu Sep 15, 2005 6:40 am, edited 1 time in total.
Roy R.
I see you've already solved it, but I'd still like to give one hint:
You say you're using an Oracle 10 client against an 8 DB. We've had a lot of problems with this, and our Oracle support said it is not supported. If you need to work with both DBs from one client (which I find somewhat of a gap in DS that this is needed at all), they advise using a 9 client (not 8, nor 10).
Thanks,
I'll see if a 9 client is available.
I think they never had 9 installed.
I'll go back to the 8 client and test; if it works fine, I'll simply wait until we go 10g in the DW to use it (hoping it will work fine when only 10g is involved).
Roy R.