Data Corruption using Oracle stage

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

roy
Participant
Posts: 2598
Joined: Wed Jul 30, 2003 2:05 am
Location: Israel

Data Corruption using Oracle stage

Post by roy »

Hi All,
I've got a case where I dump a table from Oracle 8.1.7 using the Oracle 10 client, and BigInt columns get corrupted. The corruption causes a key column value to appear in several rows of my output data (not entire duplicate rows — just the key BigInt column gets messed up) :shock: .
I've used sqlplus with both the 8 and 10 (lib32) Oracle clients to dump the data, but no duplicates for that key were found :( .
Also, if I specify a where clause in my Oracle stage to fetch that specific key, it shows only 1 row.

I found this while investigating deadlocks occurring on a job that performs an upsert to a table (in case anyone is interested).

i.e.
col1 bigint
col2 bigint
col3 char

col1 + col2 are pk

actual values
col1 col2 col3
111 222 abd
112 222 vdsv

Unloading to sequential file using DS parallel job gives
111 222 abd
111 222 vdsv


Has anyone seen this before :?:
Any help will be appreciated :!:

Thanks,
Roy R.
Time is money but when you don't have money time is all you can afford.

Search before posting:)

Join the DataStagers team effort at:
http://www.worldcommunitygrid.org
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

What are the Oracle data types for col1 and col2?
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
roy
Participant
Posts: 2598
Joined: Wed Jul 30, 2003 2:05 am
Location: Israel

Post by roy »

The fields are number(10) in the DB.
Even using int as the data type the same thing happens (the current values are smaller than 10 digits).

This gets weirder by the hour :twisted:
I use to_char on the fields and pass them through a filter or copy stage, then to a sequential file.
When I limit the job to 1000 rows I see 999 rows passed from the Oracle stage but only 100 rows after the filter/copy stage reach the sequential file :shock:
The filter condition is pure TRUE.

I'm trying to get to the bottom of things :roll:
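As an aside on why the BigInt mapping matters for number(10): a 10-digit Oracle NUMBER does not fit in a signed 32-bit integer, so any 32-bit/64-bit width mismatch in the client stack can silently rewrite key values. A minimal Python sketch of the arithmetic (illustrative only — the values in this thread reportedly fit within 32 bits, so the root cause here may simply have been the unsupported client/server combination):

```python
import struct

# Oracle NUMBER(10) can hold up to 10 decimal digits: max 9,999,999,999.
max_number10 = 9_999_999_999
int32_max = 2**31 - 1  # 2,147,483,647 -- largest signed 32-bit integer

# A full 10-digit key cannot be represented in a signed 32-bit field.
assert max_number10 > int32_max

# If such a value is nevertheless squeezed through a 32-bit slot
# (pack as 64-bit, keep only the low 4 bytes), the key silently changes:
wrapped = struct.unpack("<i", struct.pack("<q", max_number10)[:4])[0]
print(f"{max_number10} stored in 32 bits becomes {wrapped}")
```

The corrupted value is a perfectly plausible-looking number, which is what makes this kind of truncation so hard to spot in output data.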
Roy R.
Time is money but when you don't have money time is all you can afford.

Search before posting:)

Join the DataStagers team effort at:
http://www.worldcommunitygrid.org
roy
Participant
Posts: 2598
Joined: Wed Jul 30, 2003 2:05 am
Location: Israel

Post by roy »

Ok,
Someone had set force compile and trace with a limit of 100 rows per node, shewwwww :)

It is still an issue! This only solves the mystery of the output being limited to 100 rows!!!
Last edited by roy on Thu Sep 15, 2005 6:40 am, edited 1 time in total.
Roy R.
Time is money but when you don't have money time is all you can afford.

Search before posting:)

Join the DataStagers team effort at:
http://www.worldcommunitygrid.org
jasper
Participant
Posts: 111
Joined: Mon May 06, 2002 1:25 am
Location: Belgium

Post by jasper »

I see you've already solved it, but I would still like to give one hint:
You say you're using an Oracle 10 client against an 8 DB. We've had a lot of problems with this, and our Oracle support said it is not supported. If you need to work with both DBs from one client (which I find somewhat of a shortcoming in DS, that this is needed), they advise using a 9 client (neither 8 nor 10).
roy
Participant
Posts: 2598
Joined: Wed Jul 30, 2003 2:05 am
Location: Israel

Post by roy »

Thanks,
I'll see if a 9 client is available.
I think they never had 9 installed :(.

I'll go back to the 8 client and test; if it works fine, I'll simply wait until we move the DW to 10g before using the 10 client (hoping it will work fine once everything is 10g).
Roy R.
Time is money but when you don't have money time is all you can afford.

Search before posting:)

Join the DataStagers team effort at:
http://www.worldcommunitygrid.org