Search found 353 matches

by chandra.shekhar@tcs.com
Tue Jan 24, 2012 2:34 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: facing problem using Routine
Replies: 1
Views: 1089

facing problem using Routine

Hi, I have a simple job which calls a routine. The routine is a chartobits function which takes a string as input and gives bits as output. Now the job aborts, giving the following error: Transformer_2: Failed to load the library "V0S2_chartobits_Transformer_2.o"; either the directo...
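
As a rough illustration of the conversion the post describes (an assumption on my part, since the routine itself is not shown: each character expanded to its 8-bit binary form; the abort concerns loading the compiled .o library, not this logic), here is a short Python sketch with a hypothetical char_to_bits function:

Code:

# Illustrative only: one plausible reading of a "chartobits" conversion,
# expanding each character of the input string to its 8-bit binary form.
def char_to_bits(text: str) -> str:
    return ''.join(format(ord(ch), '08b') for ch in text)

print(char_to_bits("AB"))  # 0100000101000010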
by chandra.shekhar@tcs.com
Tue Jan 24, 2012 2:27 am
Forum: General
Topic: BASIC query
Replies: 10
Views: 2165

My bad, I thought mdbatra did not want any header rows. :oops:
by chandra.shekhar@tcs.com
Mon Jan 23, 2012 11:57 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: SQL Problem in Datastage
Replies: 8
Views: 5367

@Sambit
How can you get the SQL query in your job logs?
I am unable to see anything related to the query in my job logs. :(
by chandra.shekhar@tcs.com
Mon Jan 23, 2012 11:48 pm
Forum: General
Topic: BASIC query
Replies: 10
Views: 2165

Set "First Line is Column Name" to False.
The header record will then be rejected automatically.
by chandra.shekhar@tcs.com
Thu Jan 19, 2012 11:38 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Delete rows from empty table
Replies: 6
Views: 2109

Sainath, I think you need not worry about that. It is just a warning that, while deleting the rows from the table, it didn't find any. Even when you delete rows from a table in a database, it will give this warning again. And if you don't want to see this warning in the Director, just use a message handler...
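
As a side illustration of why this is only a warning, here is a quick sketch using Python's sqlite3 (not DataStage itself): a DELETE that matches no rows succeeds and simply reports zero affected rows.

Code:

import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE t (id INTEGER)')        # empty table
cur = conn.execute('DELETE FROM t WHERE id = 1')   # nothing to delete
print(cur.rowcount)                                # 0 rows affected, no error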
by chandra.shekhar@tcs.com
Wed Jan 18, 2012 7:47 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Capture Duplicates
Replies: 9
Views: 4020

Thanks Kumar and Pandeesh
That worked!! :lol:
by chandra.shekhar@tcs.com
Wed Jan 18, 2012 7:08 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Capture Duplicates
Replies: 9
Views: 4020

@Pandeesh
Yes, I want to keep all the duplicate records in one file and the unique, single-ID records in the other file.
By using an Aggregator and then a Filter, I'll have all unique ID records (even from the duplicates), and that I don't want.

@James
Nope, I am using version 8.1 :(
by chandra.shekhar@tcs.com
Wed Jan 18, 2012 5:04 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Capture Duplicates
Replies: 9
Views: 4020

Capture Duplicates

Hi, I know that this topic has been discussed a lot of times, but I didn't find one matching my requirement. My data looks like: ID CODE - 1 A, 1 B, 1 C, 2 D, 3 E, 4 F, 4 G. Here I have multiple IDs. Now I want two outputs: the first one will have the records of all the unique, single ID's (here 2 and 3...
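
To make the requirement concrete, below is a minimal Python sketch (an illustration only, not the DataStage job itself) that routes the sample rows into a duplicates output and a singles output based on how many times each ID occurs. In a parallel job this kind of split is commonly built as a fork-join: Copy, Aggregator counting rows per ID, Join back on ID, then a Filter or Transformer on the count.

Code:

from collections import Counter

# Sample rows from the post: (ID, CODE)
rows = [(1, 'A'), (1, 'B'), (1, 'C'), (2, 'D'), (3, 'E'), (4, 'F'), (4, 'G')]

id_counts = Counter(id_ for id_, _ in rows)             # rows per ID

duplicates = [r for r in rows if id_counts[r[0]] > 1]   # IDs 1 and 4
singles    = [r for r in rows if id_counts[r[0]] == 1]  # IDs 2 and 3

print(duplicates)   # [(1, 'A'), (1, 'B'), (1, 'C'), (4, 'F'), (4, 'G')]
print(singles)      # [(2, 'D'), (3, 'E')]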
by chandra.shekhar@tcs.com
Wed Jan 18, 2012 2:05 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to handle this logic?
Replies: 21
Views: 8941

Yeah Yeah.... I got it now.
That was a silly thing which I forgot.
Now I can run my job on multiple nodes and the logic is also working fine.
Thanks guys for your efforts. :)
by chandra.shekhar@tcs.com
Wed Jan 18, 2012 1:28 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to handle this logic?
Replies: 21
Views: 8941

@krypton
ACCT_NO and REC_NO are part of the primary key. So in the transformer, how can I explicitly specify hash partitioning on ACCT_NO only?
by chandra.shekhar@tcs.com
Tue Jan 17, 2012 4:25 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to handle this logic?
Replies: 21
Views: 8941

Madhav, thanks for your effort. This is what I have done:
svOldAcc ------ ACCT_NO
svNewRec ------ REC_NO
svCount ------- If (svOldAcc = svNewAcc and svOldRec = svNewRec - 1) then svCount + 1 else 0
svDscrpt ------ If (svOldAcc = svNewAcc and svOldRec = svNewRec - 1) then svDscrpt : ' ' : trim(Descri...
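
As a plain illustration of the intent behind these stage variables, here is a small Python sketch. One assumption is made, since the svDscrpt derivation is truncated above: the else branch is taken to reset the running description to the current row's value. The rows are expected to arrive already sorted by ACCT_NO and REC_NO.

Code:

def accumulate(rows):
    # rows: (acct_no, rec_no, description), sorted by acct_no, rec_no
    prev_acct, prev_rec = None, None
    count, dscrpt = 0, ''
    out = []
    for acct_no, rec_no, desc in rows:
        if acct_no == prev_acct and prev_rec is not None and rec_no == prev_rec + 1:
            count += 1
            dscrpt = dscrpt + ' ' + desc.strip()
        else:
            count = 0                      # sequence broken: reset (assumed)
            dscrpt = desc.strip()
        out.append((acct_no, rec_no, count, dscrpt))
        prev_acct, prev_rec = acct_no, rec_no
    return out

for row in accumulate([(100, 1, 'part one'), (100, 2, 'part two'),
                       (100, 4, 'restart'), (200, 1, 'other account')]):
    print(row)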
by chandra.shekhar@tcs.com
Tue Jan 17, 2012 12:08 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to handle this logic?
Replies: 21
Views: 8941

@Madhav
Added another output from the transformer that pulled across the concatenated desc value..
How??

@Krypton
That's what is happening with me. When I use hash partitioning in the same transformer (not using a Sort stage), the above logic works for some records only and not for all :cry:
by chandra.shekhar@tcs.com
Mon Jan 16, 2012 8:15 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to handle this logic?
Replies: 21
Views: 8941

But according to my requirement, I want the record when the difference between consecutive rec_no's is 1.
This constraint will not work correctly.

Code:

If svPrevKey <> svPresKey Then 0 Else If svPrevKey = svPresKey And svRecNoDiff <>1 Then 0 Else 1
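
For clarity, here is a small Python rendering of how that quoted constraint evaluates, assuming svRecNoDiff holds the gap between the current and previous REC_NO for the same key: it passes a row only when the key repeats and the gap is exactly 1.

Code:

def constraint(prev_key, pres_key, rec_no_diff):
    # Direct translation of the transformer expression quoted above.
    if prev_key != pres_key:
        return 0
    if prev_key == pres_key and rec_no_diff != 1:
        return 0
    return 1

print(constraint('A', 'B', 1))  # 0 - key changed
print(constraint('A', 'A', 2))  # 0 - same key, but gap is not 1
print(constraint('A', 'A', 1))  # 1 - same key, consecutive rec_no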