Search found 135 matches

by srinivas.nettalam
Tue Feb 21, 2012 9:02 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Setting Null to a nullable decimal column
Replies: 5
Views: 3946

Thanks, all, for your replies. I am getting NULLs in the database. I was fairly confident, but the opposing opinions were many; hence I posted. Yes, it was my fault to post before testing it on my own.
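For anyone wanting to run the same check, a quick query against the target table distinguishes true NULLs from zeros; the table and column names below are hypothetical:

    SELECT COUNT(CASE WHEN amount IS NULL THEN 1 END) AS null_rows,
           COUNT(CASE WHEN amount = 0.00  THEN 1 END) AS zero_rows
    FROM   target_table;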
by srinivas.nettalam
Tue Feb 21, 2012 8:14 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Setting Null to a nullable decimal column
Replies: 5
Views: 3946

Setting Null to a nullable decimal column

Is it true that if we use SetNull() on a nullable decimal field, it goes into the database as 0.00? It felt really weird to me when most people said that.
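For context, the pattern under discussion is a simple conditional derivation in the Transformer; a minimal sketch, with hypothetical link and column names:

    If IsNull(lnk_src.AMT) Then SetNull() Else lnk_src.AMT

Provided the column is defined as nullable all the way to the database stage, SetNull() should arrive as a true NULL; a 0.00 more likely points to a non-nullable definition or a default value being applied somewhere downstream.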
by srinivas.nettalam
Mon Dec 12, 2011 6:39 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: nulls in value columns(CDC)
Replies: 6
Views: 5704

nulls in value columns(CDC)

Hi,
Would someone let me know how CDC works when both the before and after links have NULLs in a change value column for an existing key value? Will the record be treated as an update or a copy?
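One common defensive pattern, in case NULL-to-NULL comparisons are a worry, is to substitute a sentinel in the value column on both links before the Change Capture stage; a sketch with hypothetical names:

    If IsNull(lnk_before.VAL) Then '@NULL@' Else lnk_before.VAL

With the same derivation on the after link, two NULLs compare as equal, so the row should come through as a copy rather than an update.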
by srinivas.nettalam
Tue Sep 27, 2011 2:02 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Change Capture stage warning
Replies: 14
Views: 7704

Yes, Ray.
I am mapping that column to the output link since I have to update the records in the target table with the value coming from the after dataset; it is just that changes are not captured based on that value. Is there any way to eliminate those warnings through some option in the CDC stage itself?
by srinivas.nettalam
Tue Sep 27, 2011 12:12 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Change Capture stage warning
Replies: 14
Views: 7704

Yes, the warnings were still there.
by srinivas.nettalam
Mon Sep 26, 2011 11:57 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Change Capture stage warning
Replies: 14
Views: 7704

Is that the only way? I couldn't understand the reason behind this warning.
by srinivas.nettalam
Mon Sep 26, 2011 11:48 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Change Capture stage warning
Replies: 14
Views: 7704

Yes, if no other way is found.
by srinivas.nettalam
Mon Sep 26, 2011 11:37 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Change Capture stage warning
Replies: 14
Views: 7704

Change Capture stage warning

Hi all, I am getting the warning below, and warnings are not allowed in our project. I am using the column mentioned in the warning in both the "after" and "before" datasets, but I excluded it from being a value column, and it is present in the output columns for the updates on the tables...
by srinivas.nettalam
Wed Aug 10, 2011 6:17 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: multiple update statements in oracle enterprise stages?
Replies: 1
Views: 2299

multiple update statements in oracle enterprise stages?

I am writing multiple update statements in the Oracle Enterprise stage (target), but I am getting an error "main_program: (aptoci.C:456). Message: ORA-00911: invalid character" and the job is aborting. A similar query worked for loading SQL Server tables, but for Oracle it is not working. Would some...
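For what it's worth, ORA-00911 is frequently caused by a statement terminator: the semicolons that SQL Server accepts inside a batch are invalid characters to Oracle's call interface. A sketch of user-defined update SQL with hypothetical table and column names, one statement per line and no trailing semicolons:

    UPDATE target_t SET col_a = ORCHESTRATE.col_a WHERE key_col = ORCHESTRATE.key_col
    UPDATE audit_t SET load_dt = SYSDATE WHERE key_col = ORCHESTRATE.key_col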
by srinivas.nettalam
Sun Jun 12, 2011 11:40 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Dataset created on only one node
Replies: 3
Views: 1610

Is auto partitioning disabled in your environment ($APT_NO_PART_INSERTION=1)? Or the Copy stage was probably optimized out by the engine at submission; in that case, no partitioner was inserted in front of the Data Set stage when the job ran, and therefore the data was not repartitioned. You ...
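To confirm what the engine actually generated, the job score can be dumped to the log; these are standard parallel-engine environment variables:

    APT_DUMP_SCORE=1        # print the score: operators, inserted partitioners, node maps
    APT_NO_PART_INSERTION=0 # allow the engine to insert partitioners automatically

The score shows whether a partitioner was inserted ahead of the Data Set stage and on how many nodes each operator ran.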
by srinivas.nettalam
Thu Jun 09, 2011 5:43 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Dataset created on only one node
Replies: 3
Views: 1610

Dataset created on only one node

I have a Sequential File stage as the source, then a Copy stage and a Data Set stage. The partitioning on both the Copy and Data Set stages is "Auto". I observed that the dataset is created on only one node even though the job ran on four nodes. I assumed that the Copy stage invokes round robin by default and that the records would be distributed among th...
by srinivas.nettalam
Fri May 20, 2011 4:15 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Question on Sequential file stage parsing
Replies: 3
Views: 2661

Question on Sequential file stage parsing

The input record in a sequential file is ABCDEFGH. Can we get the output as A B C D E F G H? One of my colleagues asked me this, and I replied that we can't. It is probably one of those interview questions, but is it really possible? If the record length is always the same, then we can pivot (horizontal), but the re...
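For what it's worth, on engines with Transformer looping (8.5 and later) one sketch is to emit one row per character and reassemble afterwards; the link names are hypothetical:

    Loop condition:    @ITERATION <= Len(lnk_in.rec)
    Output derivation: lnk_in.rec[@ITERATION, 1]

Each input record then yields one row per character, which a downstream Pivot (or a fold back into a single space-delimited string) could turn into A B C D E F G H.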
by srinivas.nettalam
Wed May 18, 2011 12:05 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: skip header info from binary file
Replies: 4
Views: 2430

The header gets rejected by the Sequential File stage, but the job aborts saying there is an import error in the second record, which actually contains proper data.
by srinivas.nettalam
Tue May 17, 2011 4:11 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: skip header info from binary file
Replies: 4
Views: 2430

skip header info from binary file

Hi,
We are receiving fixed-width binary files from a mainframe environment that contain header info in the first line. We have to skip the first row and load the others. Please suggest a way to do it.
I tried sending the first record to the reject link, but the job aborts.
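One approach, assuming the header occupies a known fixed number of bytes, is the Sequential File stage's Filter option with a command that strips those bytes before the import runs; the 120-byte header below is only an example:

    dd bs=120 skip=1

dd reads the incoming stream, skips the first 120-byte block, and passes everything after it through to the import unchanged.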
by srinivas.nettalam
Tue May 10, 2011 11:31 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: surrogate key generator generating improper sequence
Replies: 7
Views: 2928

I changed the File block size to 1 and attained the desired output.
Thank you, all.