
Date columns in parallel job behaving peculiarly

Posted: Tue Jun 08, 2010 3:35 am
by srilaxmi
Hi,
I have a parallel job that reads data from a SQL Server table and loads it into an Oracle table. There are 7 to 8 date columns in both the source and target tables.
The Transformer is just a plain one-to-one mapping.
The problem is that whenever I run this job in the production environment, one or another of the date columns is populated with null values, even though the source SQL Server table has data for all the date columns.
The job completes successfully without throwing any warnings, but when I verify the data by querying the database I see nulls.
When I run this job in the development environment it works fine and all the date columns show data.
Can somebody help me solve this issue?
Thanks,
Srilaxmi.

Posted: Tue Jun 08, 2010 3:49 am
by ArndW
I'm not sure I follow your description correctly, but I would recommend adding a second output from your Transformer stage to a Peek stage so that you can determine whether the erroneous null values occur when reading from SQL Server into DataStage or when writing to Oracle.

Re: Date columns in parallel job behaving peculiarly

Posted: Tue Jun 08, 2010 12:13 pm
by vinnz
srilaxmi wrote:The problem is that whenever I run this job in the production environment, one or another of the date columns is populated with null values, even though the source SQL Server table has data for all the date columns.
What are some of the values being dropped? Is the complete record being dropped, or just the column value? Does the job work properly in your development environment when the same values are in the source table?

Posted: Tue Jun 08, 2010 10:12 pm
by srilaxmi
On each run in production, one of the date columns (the complete column, for all records) is populated as NULL.

I have executed the same job in development against the same source table, and all the date columns were populated properly.

Posted: Tue Jun 08, 2010 10:37 pm
by ray.wurlod
Any one or every one? Randomly/intermittently, for every row, or in something for which you can discern a pattern? Are your dates using the default date format?

Those are just a few of the things that a systematic diagnosis might investigate.

Posted: Wed Jun 09, 2010 1:46 pm
by vinnz
Are there any metadata differences between the column(s) being dropped and the ones that are not? Are the rest of the columns getting the right values, or are they getting skewed/mis-mapped? Do you have any message handlers defined in your production environment?

Posted: Fri Jun 11, 2010 7:32 am
by vkhandel
I would suggest reading the DATE columns as VARCHAR (SQL Server stores date values in the default format "%dd-%mm-%yyyy"), converting them to dates in the Transformer, and then loading them into Oracle.
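For example, a Transformer derivation along these lines would handle the conversion (the link and column names here are placeholders, and the format string must match whatever string the SQL Server stage actually delivers):

    StringToDate(lnk_src.ORDER_DT, "%dd-%mm-%yyyy")

If the incoming string can be null or empty, test it first (e.g. with IsNull() on the placeholder lnk_src.ORDER_DT) so the conversion does not produce rejects.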

Posted: Fri Jun 11, 2010 8:30 am
by Sreenivasulu
What 'vkhandel' says is apt. You need to read all the date columns as varchar, since the date format settings in the database and in DataStage could be different.


Regards
Sreeni

Posted: Fri Jun 11, 2010 11:08 am
by priyadarshikunal
You should not be worried about the date format in different databases, and you don't have to read the dates as varchar.

Check the warnings in that job.

Are you using a BASIC Transformer in the job?