Table structure differs from actual table in Oracle

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

balu124
Participant
Posts: 49
Joined: Wed Jul 25, 2007 9:53 pm

Table structure differs from actual table in Oracle

Post by balu124 »

Hi,
In Oracle, the data type of a column in the table is INTEGER. I imported the table into the DataStage repository using the assisted approach.

When I tried to define the columns for that table in the source stage, the INTEGER column's data type appears as Decimal, and the same thing happens with the DATE column (original), which appears as Timestamp (converted in DataStage).

When we execute the job keeping the imported types as they are (not converting them), the job aborts with errors like the following:

implicit conversion not possible from timestamp to date; implicit conversion not possible from integer to decimal.

How can this be solved?
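(For reference: Oracle's INTEGER type is just a synonym for NUMBER(38), so the importer showing that column as Decimal(38,0) is expected rather than a corrupted definition, and Oracle DATE carries a time portion, which is why it tends to come through as Timestamp. If you prefer to keep the imported Decimal and Timestamp metadata and convert explicitly, a Modify stage can do it. A rough sketch of the specifications, using made-up column names, might be:

    EMP_ID:int32 = int32_from_decimal(EMP_ID)
    HIRE_DATE:date = date_from_timestamp(HIRE_DATE)

The first line turns the imported Decimal(38,0) into a 32-bit integer; the second turns the Timestamp into a Date. These follow the standard parallel-engine Modify conversion names, but verify them against the Modify stage documentation for your release.)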
Maveric
Participant
Posts: 388
Joined: Tue Mar 13, 2007 1:28 am

Post by Maveric »

Did you try running the job by changing the data types to integer and date respectively?
vmcburney
Participant
Posts: 3593
Joined: Thu Jan 23, 2003 5:25 pm
Location: Australia, Melbourne

Post by vmcburney »

You could try importing the table definition using the Import "Orchestrate Schema Definition" instead of the Plug-in importer. It might give you more accurate metadata for a parallel job. It's a bit trickier to use but it can log into Oracle to get metadata and creates a normal table definition that you can use in a job.
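(For context, that importer works from a parallel-engine record schema read out of the Oracle dictionary, which it then saves as an ordinary table definition. The underlying schema looks something along these lines; the column names here are purely illustrative:

    record (
        EMP_ID: int32;
        HIRE_DATE: date;
        LAST_UPDATED: timestamp;
    )

Whether a particular NUMBER column comes back as an integer type or as decimal still depends on how it was declared in Oracle, so it is worth opening the generated definition and checking it before wiring it into a job.)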
balu124
Participant
Posts: 49
Joined: Wed Jul 25, 2007 9:53 pm

Post by balu124 »

Maveric wrote: Did you try running the job by changing the data types to integer and date respectively?
Hi Maveric,
When we change the target metadata and run the job, we get errors like "numeric value out of range", even though we define the metadata correctly.
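(One thing worth ruling out: a DataStage Integer (int32) only holds values up to 2,147,483,647. Because Oracle's INTEGER is really NUMBER(38), the column can legitimately contain values far beyond that, and forcing it to Integer in the job will then fail with exactly this kind of "numeric value out of range" message. In that case keep the column as Decimal(38,0), or, if the data fits in 64 bits, map it to BigInt; as a rough Modify-stage sketch with a hypothetical column name:

    ORDER_ID:int64 = int64_from_decimal(ORDER_ID)

Again, check the conversion name against your release's Modify stage documentation before relying on it.)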
balu124
Participant
Posts: 49
Joined: Wed Jul 25, 2007 9:53 pm

Post by balu124 »

vmcburney wrote: You could try importing the table definition using the Import "Orchestrate Schema Definition" instead of the Plug-in importer. It might give you more accurate metadata for a parallel job. It's a bit ...
Hi vmcburney,
Could you please explain the steps for importing the Oracle tables using "Orchestrate Schema Definition"?

I imported the definitions using ODBC table definitions.

Also, please let me know whether a separate DSN needs to be defined for this, as I defined the existing DSN using ODBC drivers.