I am trying to join data from two tables using a Join stage and two ODBC stages for Oracle.
The job is hanging, giving the following warning:
ODBC_Enterprise_10: When checking operator: When binding output interface field "ADDRESS_ID" to field "ADDRESS_ID": Implicit conversion from source type "dfloat" to result type "decimal[38,10]": Possible range/precision limitation.
However, when I look up the ADDRESS_ID data type in Toad it says NUMBER, but when I import it into DataStage the data type gets converted to Decimal automatically.
I'm not sure whether the job is hanging for this reason, but I cannot see any more messages in the Director log and I have had to stop the job manually. If you have any idea what's happening, please share.
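For what it's worth, the precision limitation the warning mentions is real: a dfloat is a C double, which carries only about 15-17 significant decimal digits, while decimal[38,10] can carry 38. A small illustration (the value is invented, not your data):

```python
from decimal import Decimal

# A 21-digit value that fits comfortably in decimal[38,10]...
exact = Decimal("12345678901234567890.123")

# ...but a double ("dfloat") keeps only ~15-17 significant digits,
# so the round trip through float silently changes the value.
as_double = float(exact)
assert Decimal(as_double) != exact
print(Decimal(as_double))  # the digits past ~17 are lost
```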
kumar_s wrote: Are you sure that not even a single record goes beyond (38,10)?
I am importing the metadata using ODBC DSNs. I am not sure about the data, but those are all IDs and they are defined as NUMBER, as seen from Toad. I am not sure whether that would correspond to double or dfloat.
It was just an observation. The first time I encountered this error message, that was the cause. I had hoped to cover the extra possibility for the next person who searches.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Just curious - where did the [38,10] come from? A NUMBER in Oracle, one with no precision specified, is the same as saying NUMBER(38). There's no scale associated with it.
I'd suggest trying [38] or maybe [38,0]. Not familiar enough with PX to get any more specific than that.
Some stages over on the Server side have a quirk where you have to define the metadata to be '40' rather than '38' to stop odd errors about data truncation, from what I recall. Something to keep in mind if it becomes an issue over here.
-craig
"You can never have too many knives" -- Logan Nine Fingers
Yeah, it's just a NUMBER without any precision; when I import the metadata it automatically takes (38,10). I tried changing the data type to Integer and the length to 38, but the job still hangs. Rows are coming from one ODBC stage, but from the other ODBC stage no rows are coming and it hangs.
Is there any preprocessing to be done before using a JOIN/LOOKUP stage?
One of the ODBC sources might have duplicate records on the joining key, but the rows from that ODBC source are coming all right.
Yes, some preprocessing is required, such as sorting and de-duplication, but the error you got is not relevant to that.
Just try making that particular field a dfloat.
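As a rough sketch of that preprocessing (illustrative Python, not DataStage syntax; the sample rows and key values are invented), a Join stage expects both inputs sorted on the join key, with duplicates removed if only one match per key is wanted:

```python
from itertools import groupby

# Invented sample inputs; key 2 has a duplicate on the left side.
addresses = [(3, "C St"), (1, "A St"), (2, "B St"), (2, "B St dup")]
people = [(2, "Bob"), (1, "Alice")]

def dedup_sorted(rows):
    """Sort on the join key, then keep the first row per key."""
    return [next(g) for _, g in groupby(sorted(rows), key=lambda r: r[0])]

left = dedup_sorted(addresses)
right = sorted(people)

# A simple keyed inner join over the prepared inputs.
joined = [(k, a, p) for k, a in left for kk, p in right if k == kk]
print(joined)  # [(1, 'A St', 'Alice'), (2, 'B St', 'Bob')]
```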
Impossible doesn't mean 'it is not possible' actually means... 'NOBODY HAS DONE IT SO FAR'
A few observations, using the Oracle Enterprise stage rather than ODBC stage, that may help:
Defining a NUMBER with no precision in Oracle is not the same as saying NUMBER(38) or NUMBER(38,0). Only the NUMBER will accept a decimal place. The other two are constrained to hold only integer values.
The metadata for an Oracle number or number(38) defaults to Decimal(38,10). DataStage PX treats this as a floating point decimal and provides a warning that it 'is not fully supported; adjusting scale' when read from Oracle.
In PX an Integer of length 38 is not possible - at least not on any flavour of Unix I have used. Remember that there are different types of integer that should be chosen for the best match to size - TinyInt, Integer and BigInt. From recollection a signed Integer only supports about 9 digits (up to 2^31 - 1) and a BigInt about 19 digits (up to 2^63 - 1). Bigger than that and you need decimals, floats or doubles.
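Those digit limits are easy to verify, assuming the usual signed 32-bit Integer and 64-bit BigInt:

```python
# Largest values a signed 32-bit Integer and 64-bit BigInt can hold.
int32_max = 2**31 - 1   # 2147483647
int64_max = 2**63 - 1   # 9223372036854775807

# 10 digits at most for Integer (only 9 are guaranteed to fit),
# 19 for BigInt (18 guaranteed) -- wider keys need decimal or float.
print(len(str(int32_max)))  # 10
print(len(str(int64_max)))  # 19
```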
As you are not sure what is causing the job to hang, try separating the Oracle stages from the rest of the job. Write each one to a dataset first and see if that works.
Are you getting any data throughput showing at all?
thompsonp wrote:
Are you getting any data throughput showing at all?
I am getting data from one of the JOIN sets but not from the other.
I have sorted and hash partitioned both sources.
I got this fatal error message:
Innerjoin_Addrrid.InAddress_Sort,0: Failure during execution of operator logic.
I finally resorted to writing user-defined SQL, which works fine, but the data looks garbled, with all junk characters. I think it's got something to do with NLS, which, by the way, is not installed on our server.
So I guess you have quite a few issues to look into.
As observed by thompsonp, the fact that the metadata for an Oracle NUMBER or NUMBER(38) defaults to Decimal(38,10) might be relevant information.
I am getting data from one of the JOIN sets but not from the other.
You mean the input dataset is not getting read?
data looks garbled with all junk characters
In the input or in the output table?
Impossible doesn't mean 'it is not possible' actually means... 'NOBODY HAS DONE IT SO FAR'
I am getting data from one of the JOIN sets but not from the other.
You mean the input dataset is not getting read?
Yes, from one of the inputs (an ODBC source, though).
data looks garbled with all junk characters
In the input or in the output table?
In both the input and the output. The input looks like all '?????' while the output has all weird characters; the output is a flat file. I think that is an NLS problem, as NLS is not installed on our server.
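The '?????' and junk-character symptoms are classic character-set mismatches. A hypothetical illustration (the sample string and encodings are assumptions, not your actual NLS settings):

```python
name = "Müller Straße"  # invented non-ASCII sample value

# Forcing non-ASCII text into an ASCII-only target turns every
# unrepresentable character into '?' -- the all-????? symptom.
questioned = name.encode("ascii", errors="replace").decode("ascii")
print(questioned)  # M?ller Stra?e

# Decoding bytes with the wrong character set (Latin-1 bytes read
# as UTF-8) yields replacement/junk characters instead.
garbled = name.encode("latin-1").decode("utf-8", errors="replace")
print(garbled)
```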
But no data is passed from the Join stage. I tried putting some Peek and Copy stages before and after the Join stage; data actually flows from that ODBC source even though it displays 0 rows per sec. But the Peek stage after the Join stage does not pass any rows.
0 rows/sec is not the same as 0 rows. These rates are rounded to the nearest whole number, which means that the link finished in less than 0.5 sec. Another reason I believe that rows/sec is not a useful metric.
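A toy illustration of how rounding can hide throughput (the row count and timing are invented, and the Director's exact formula may differ):

```python
# 10 rows moved during a 25-second run: data really flowed, but the
# rate rounds to the nearest whole number and displays as 0 rows/sec.
rows, elapsed_seconds = 10, 25
displayed = round(rows / elapsed_seconds)  # 0.4 rounds to 0
assert rows > 0 and displayed == 0
print(rows, displayed)
```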
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.