Hi chulett, thanks for replying.
I now have these stage variables:
Isdup: If DSLink2.deptno = Lastvalue Then 1 Else 0
Lastvalue: DSLink2.deptno
Constraint: Isdup = 0
I still could not remove the duplicate values. Could you please explain the logic to me? Do I have to set a default value for Lastvalue?
Thanks
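For reference, the stage-variable logic can be sketched in Python (a hypothetical stand-in for the Transformer stage; the column name `deptno` and the sentinel initial value are assumptions). The key points are that Isdup must be evaluated before Lastvalue is updated, and Lastvalue needs an initial default that cannot match the first real key:

```python
def dedup_sorted(rows):
    """Keep only the first occurrence of each value in a key-sorted stream,
    mimicking the Transformer stage variables Isdup and Lastvalue."""
    lastvalue = None  # initial default: guaranteed not to equal any real key
    out = []
    for deptno in rows:
        isdup = 1 if deptno == lastvalue else 0  # stage variables evaluate top-down,
        lastvalue = deptno                       # so Isdup is set before Lastvalue updates
        if isdup == 0:                           # output-link constraint: Isdup = 0
            out.append(deptno)
    return out

print(dedup_sorted([100, 100, 101, 101, 102, 103, 104]))  # [100, 101, 102, 103, 104]
```

Note that this only works if the input is sorted (and hash-partitioned) on the key, so that duplicates arrive on adjacent rows.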
Search found 25 matches
- Sun Feb 13, 2011 9:17 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: compare values in transformer stage
- Replies: 27
- Views: 14784
- Sun Feb 13, 2011 7:17 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: compare values in transformer stage
- Replies: 27
- Views: 14784
- Sun Feb 13, 2011 6:53 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: compare values in transformer stage
- Replies: 27
- Views: 14784
- Sun Feb 13, 2011 6:25 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: compare values in transformer stage
- Replies: 27
- Views: 14784
compare values in transformer stage
Hi,
How can I remove duplicates in the Transformer stage using stage variables?
I have only one column in my source:
payment ID
100
100
101
103
102
101
104
After sorting and hash partitioning on this column, how would I do this in the Transformer stage? Could you please explain the logic to me?
- Wed Feb 09, 2011 4:08 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: how data loads in db2 stage after restarting
- Replies: 6
- Views: 2053
- Wed Feb 09, 2011 3:13 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: how data loads in db2 stage after restarting
- Replies: 6
- Views: 2053
how data loads in db2 stage after restarting
Hi, I have a job that writes 1000 records to a DB2 database. My question is: suppose the job aborts due to connectivity issues after writing 500 records. How will I make sure that, after restarting, it writes from record 501? Dataset ------> DB2. Please explain this to me. What are the options I have to use in the...
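One general restart pattern (sketched below in Python; the checkpoint file name and commit interval are assumptions for illustration, not DataStage or DB2 stage options) is to commit in batches and record the count of committed rows, so a rerun resumes from the first uncommitted record:

```python
import os

CHECKPOINT = "load.ckpt"   # hypothetical checkpoint file
BATCH = 100                # hypothetical commit interval

def last_committed():
    """Return the number of rows already committed; 0 on a first run."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return int(f.read().strip() or 0)
    return 0

def load(records, write_batch):
    """Write records in batches, checkpointing after each commit so a
    restart skips rows that were already written."""
    done = last_committed()
    for start in range(done, len(records), BATCH):
        batch = records[start:start + BATCH]
        write_batch(batch)                    # e.g. INSERT + COMMIT
        with open(CHECKPOINT, "w") as f:      # persist progress after the commit
            f.write(str(start + len(batch)))
```

On the first run the checkpoint is absent and loading starts at row 0; after an abort, `last_committed()` tells the rerun where to resume.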
- Tue Feb 08, 2011 12:51 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Aggregation in Transformation Stage
- Replies: 4
- Views: 1878
- Tue Feb 08, 2011 12:22 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Aggregation in Transformation Stage
- Replies: 4
- Views: 1878
- Tue Feb 08, 2011 12:19 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Aggregation in Transformation Stage
- Replies: 4
- Views: 1878
Aggregation in Transformation Stage
Hi,
Could you please let me know how to perform aggregation in the Transformer stage.
Source data
Deptno,sal
10,200
20,300
30,400
20,300
10,300
30,400
I want to aggregate the data by deptno in the Transformer stage. How can I achieve this?
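On key-sorted input, the usual Transformer pattern keeps a running total in stage variables and emits a row when the key changes. A Python sketch of that logic (column names `deptno` and `sal` are taken from the post; the function name is an illustration):

```python
def aggregate_sorted(rows):
    """Sum sal per deptno on input sorted by deptno, mimicking running-total
    stage variables that reset when the key value changes."""
    out = []
    prev_dept, total = None, 0
    for deptno, sal in rows:
        if prev_dept is not None and deptno != prev_dept:
            out.append((prev_dept, total))  # key changed: emit the finished group
            total = 0
        prev_dept = deptno
        total += sal
    if prev_dept is not None:
        out.append((prev_dept, total))      # emit the last group
    return out

data = sorted([(10, 200), (20, 300), (30, 400), (20, 300), (10, 300), (30, 400)])
print(aggregate_sorted(data))  # [(10, 500), (20, 600), (30, 800)]
```

As with deduplication, this only works when the input is sorted (and hash-partitioned) on deptno so each group's rows are contiguous.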
- Tue Feb 08, 2011 12:16 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: source records of different bytes to target db with same byt
- Replies: 1
- Views: 961
source records of different bytes to target db with same byt
Hi All,
I was asked a question this morning:
If we have source records of different lengths, such as 200 bytes, 300 bytes, and 400 bytes, how can I load them into a target database where each record must be 500 bytes in size?
Please let me know.
Thank you
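The usual answer is to pad each record out to the fixed target length before the load. A minimal Python sketch (the 500-byte length comes from the question; the space pad character and the raise-on-overflow rule are assumptions):

```python
TARGET_LEN = 500  # fixed record length required by the target

def to_fixed_length(record: bytes, length: int = TARGET_LEN, pad: bytes = b" ") -> bytes:
    """Pad a record with the pad byte up to the fixed length. Records longer
    than the target need a business rule; here we simply raise."""
    if len(record) > length:
        raise ValueError("record exceeds fixed length")
    return record.ljust(length, pad)

print(len(to_fixed_length(b"x" * 200)))  # 500
```

In a DataStage job the same effect is typically achieved by defining the target column as a fixed-length type so shorter values are padded on write.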