Search found 32 matches

by adityavinay
Tue Apr 09, 2013 8:03 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Order of Columns in Vertical Pivot
Replies: 4
Views: 1590

Don't do any partitioning; leave it as Auto. Just follow what Ray suggested and run your Pivot stage in sequential mode. It should give the results as expected.
by adityavinay
Sat Apr 06, 2013 5:12 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: logic required
Replies: 20
Views: 7591

hargun wrote: This is not working; it will not increment the Group for a different phone number.
I've tested it and it's working fine as per your requirement. Make sure you are sorting and creating the key change column only on the phone number.
by adityavinay
Fri Apr 05, 2013 2:11 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Teradata Variant
Replies: 4
Views: 2945

As explained by kcirtap24, variant 12 is used to load the Teradata utilities for TTU 12 and higher. These utilities are used when performing bulk loads.
For normal loads, it doesn't matter which variant you use.
by adityavinay
Fri Apr 05, 2013 1:49 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: logic required
Replies: 20
Views: 7591

hargun, if you are using version 8.5 or later, try this approach: File -> Sort stage -> Transformer -> File. Create a key change column in the Sort stage and sort ascending on phone number. In the Transformer, create two stage variables: sv1: LastRowInGroup(phone number) - this will return 1 if it is the last row in the group; sv2: if ...
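The sv2 definition is truncated above, so purely as an illustration of the same grouping idea, here is a minimal sketch that numbers the groups with the key change column instead (the link, column and variable names are my own placeholders, not from the post):

Sort stage: sort ascending on PhoneNumber, Create Key Change Column = True (adds a keyChange column that is 1 on the first row of each phone number and 0 otherwise)
Transformer stage variable (initial value 0): svGroup: If lnk_sorted.keyChange = 1 Then svGroup + 1 Else svGroup
Output derivation: Group = svGroup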
by adityavinay
Wed Mar 27, 2013 6:41 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Teradata Connector bulkload
Replies: 5
Views: 4517

Ray,
As you mentioned, the Load type works as FastLoad, and FastLoad usually requires an empty target table to load into.
If the table action is "Append" and the load type is "Load",
the target table will already contain the data from the first run when the next run starts. How does it work in that case?

Thanks,
Aditya.
by adityavinay
Tue Mar 26, 2013 2:01 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Teradata Connector bulkload
Replies: 5
Views: 4517

Teradata Connector bulkload

Hi, I am using bulk mode in the Teradata Connector stage to insert huge volumes of data into target tables. My question is: does the bulk load behave differently depending on the target table? I mean: 1) if the target table is empty, does bulk mode treat it as a FastLoad? 2) if the target table has data in it, does it treat it as a mult...
by adityavinay
Thu Jan 17, 2013 1:49 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: identify descriptor file of datafile
Replies: 2
Views: 1056

Thanks for the quick response. It seems the link you provided requires special access.
The following is what I get when I open that link:
"Sorry, but only users granted special access can read topics in this forum."
by adityavinay
Thu Jan 17, 2013 1:21 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: identify descriptor file of datafile
Replies: 2
Views: 1056

identify descriptor file of datafile

Hi ,

I would like to know if we can identify orphaned DataSet data files on the resource disk. Could you please let me know if there is any command to do so?

Thanks in Advance.
Aditya.
by adityavinay
Thu Oct 04, 2012 2:31 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Teradata warning from character set to ISO-8859-1 may affect
Replies: 4
Views: 3525

Williams, changing the NLS at the job/project level will resolve the issue. Do you have any idea why that change is required when both represent the same character set? According to IANA (http://www.iana.org/assignments/character-sets), ISO_8859-1:1987 is an alias of ISO-8859-1: Name: ISO_8859-1:1987 [RFC1345,...
by adityavinay
Thu Jul 26, 2012 7:55 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Not In clause in Filter stage...
Replies: 8
Views: 15272

I have already tried that and it did not work... I remember in DataStage you have to use OR for NOT IN conditions...
I used that syntax (AND) and it is working.
Input (col1, col2, col3): 1,10,2 / 2,20,2 / 3,30,2 / 4,40,4
Filter condition: col2 <> 10 And col2 <> 20 And col3 <= 2
Output (col1, col2, col3): 3,30,2
by adityavinay
Wed Jul 25, 2012 12:44 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Not In clause in Filter stage...
Replies: 8
Views: 15272

Re: Not In clause in Filter stage...

Use this syntax (use AND instead of OR)
(Field1 <> Val1 and Field1 <> Val2 and Field1 <> Val3) and Field2 <= Val4
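As an illustration only (Field1, Field2 and Val1-Val4 are placeholders carried over from the line above), the Where clause works because it is the De Morgan rewrite of a NOT IN condition, which is why AND is needed rather than OR:

SQL-style condition: Field1 NOT IN (Val1, Val2, Val3) And Field2 <= Val4
Equivalent expansion: Not (Field1 = Val1 Or Field1 = Val2 Or Field1 = Val3) And Field2 <= Val4
Filter stage Where clause: Field1 <> Val1 And Field1 <> Val2 And Field1 <> Val3 And Field2 <= Val4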
by adityavinay
Thu Jul 19, 2012 10:08 am
Forum: General
Topic: user variable activity
Replies: 2
Views: 2225

Re: user variable activity

Found the solution on publib.boulder.ibm.com.
This can be achieved using the Oconv function.
Syntax: Oconv("outputvalue", "MCN")
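For illustration (the input strings below are made-up examples, not from the original post), the MCN conversion code keeps only the numeric characters of its input, which is what makes the resulting parameter usable as an integer in the Start Loop activity:

Oconv("loop_count=25", "MCN")  returns "25"
Oconv("0042", "MCN")  returns "0042"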
by adityavinay
Thu Jul 19, 2012 9:59 am
Forum: General
Topic: user variable activity
Replies: 2
Views: 2225

user variable activity

Can someone help me with how to convert a string to an integer in a User Variables activity in a sequence job? I have a value in a file; using an 'Execute Command' activity I am reading the value and turning it into a parameter with a 'User Variables' activity. This parameter will be used in a Start Loop activity. Is it possible to con...
by adityavinay
Mon May 07, 2012 1:42 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to implement the logic in transformer stage
Replies: 1
Views: 904

Re: How to implement the logic in transformer stage

If (OBLT.OBL_COL_TT <= 99 and OBLT.OBL_COL_TT >= 0) Then "0" else (If (char(OBLT.OBL_COL_TT)[1,1] = '2' and Trim(USER_DATA.USER_CD_2) = 'A') then "N" else " whateveryourreqis")
If (OBLT.OBL_COL_TT = 200 or OBLT.OBL_COL_TT = 250) then "AH" else "AG"
by adityavinay
Fri Apr 27, 2012 8:17 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Unicode conversion failed
Replies: 4
Views: 3658

Re: Unicode conversion failed

I faced the same issue, but with Teradata. I changed the NLS map to ASCL_ISO8859-1 instead of UTF-8 and it worked...