Search found 246 matches
- Thu Oct 13, 2011 2:37 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: stage to get the max time for each date
- Replies: 11
- Views: 6015
stage to get the max time for each date
I have duplicate records coming in with the same dates but different times. I also have a unique key, personnumber, which is unique for each person. Can anyone suggest a stage where I can get the max time for that particular day, pick that single record, and eliminate the records of the other timi...
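A minimal Python sketch of what the poster is asking for (keep only the record with the latest time per person per date); in DataStage this would typically be a Sort plus Remove Duplicates stage, but the logic itself is just a grouped max. The field names and sample values are assumptions, not taken from the poster's actual data:

```python
# Keep the record with the max time for each (personnumber, date) group.
# Zero-padded "HH:MM" strings compare correctly as plain strings.
records = [
    {"personnumber": 1, "date": "2011-10-13", "time": "09:00"},
    {"personnumber": 1, "date": "2011-10-13", "time": "17:30"},
    {"personnumber": 1, "date": "2011-10-14", "time": "08:15"},
]

latest = {}
for rec in records:
    key = (rec["personnumber"], rec["date"])
    if key not in latest or rec["time"] > latest[key]["time"]:
        latest[key] = rec  # keep the later time for this person/date

result = sorted(latest.values(), key=lambda r: (r["personnumber"], r["date"]))
```

With the sample input above, `result` holds one record per date, each carrying that date's latest time.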
- Fri Oct 07, 2011 9:51 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Transformer logic
- Replies: 5
- Views: 1416
OK. I will try to check each pass by comparing, as the records are sorted. In the path field below, is there a function to find the node next to the node which is found? I mean, let's say the input record coming to the transformer is X. It needs to check the path records, find it, and then next ...
- Fri Oct 07, 2011 8:27 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Transformer logic
- Replies: 5
- Views: 1416
Actually, if a node is expired, the nodes at the level below it are to be expired, and then their lower-level nodes down to the end of the tree are expired. They are related by node, next_level (the level above the node), and path, which shows the path (e.g. last path X/A/B: if A expires, then B, and then the nodes for wh...
- Thu Oct 06, 2011 8:22 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Transformer logic
- Replies: 5
- Views: 1416
Transformer logic
I have three fields, node, next_level, and path, which have values like below: node: A|G|J|H|B|I|C|E| next_level: X|X|G|G|A|G|A|B| path: /X|/X|/X/G|/X/G|/X/A|/X/G|/X/A|/X/A/B| (ignore the pipe (|) between the records for the three fields; the rest are the records in the table). If, let's say, the next_level exp...
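Reading the three parallel fields as columns, the cascade the poster describes can be done with path-prefix matching rather than walking the tree level by level: a node's descendants are exactly the rows whose path starts with that node's full path. A hedged Python sketch using the post's own sample data (the `expire` helper is an invented name, not a DataStage function):

```python
# Sample data transcribed from the post: three parallel columns.
nodes = ["A", "G", "J", "H", "B", "I", "C", "E"]
paths = ["/X", "/X", "/X/G", "/X/G", "/X/A", "/X/G", "/X/A", "/X/A/B"]

def expire(node):
    """Return the node plus all its descendants, found by path-prefix matching."""
    # A node's full path is its stored (parent) path plus its own name.
    idx = nodes.index(node)
    full = paths[idx] + "/" + node
    expired = [node]
    for n, p in zip(nodes, paths):
        # Descendants carry the full path either exactly or as a prefix.
        if p == full or p.startswith(full + "/"):
            expired.append(n)
    return expired
```

With this data, expiring A cascades to B, C, and E, matching the behaviour described in the post (A expires, then B, then B's children).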
- Tue Sep 27, 2011 10:39 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: commit after each row
- Replies: 6
- Views: 1768
This is a typical Type II job where we capture changes and update the records. But before CDC we need to send the records with the parent key, which we get after doing a lookup with the target. If another job is developed in the above kind of scenario, we cannot compare the keys, and it also may not pick...
- Tue Sep 27, 2011 8:38 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: commit after each row
- Replies: 6
- Views: 1768
This field cannot be populated into the target before loading the parent record. In this scenario the records from the source enter as groups, each unique by hiername and effdate. For example, the first 5 records belong to one group, the next 4 records to the next group, 6 records to the third group, and so on......
- Tue Sep 27, 2011 8:54 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: commit after each row
- Replies: 6
- Views: 1768
I tried to do a sparse lookup with the target to get the primary key value which is to be loaded, but that did not help. I need to load a record, do a lookup with that record, load the second record, then look up the second record........ I also committed after 1 row in the target, but still the field is not...
- Mon Sep 26, 2011 12:04 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: commit after each row
- Replies: 6
- Views: 1768
commit after each row
In my requirement a table is loaded with a primary key, and in the middle of the job this primary key is taken as another field to be loaded. I need to do a lookup with the target to get this primary key and use it to load another field. Please suggest how I can do this with a commit after each row after lo...
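The pattern being asked for (load a row, commit, then look the key back up from the target to feed a later field) can be sketched outside DataStage with a small database script. This is a hedged illustration using Python's sqlite3, with an invented `target` table and column names; it is not the poster's actual job, only the commit-then-lookup sequence:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE target (pk INTEGER PRIMARY KEY, name TEXT, parent_pk INTEGER)"
)

# Each row may reference a row loaded just before it, so the lookup
# must run against the target after every single-row commit.
rows = [("root", None), ("child1", "root"), ("child2", "child1")]

for name, parent_name in rows:
    parent_pk = None
    if parent_name is not None:
        # Sparse-lookup equivalent: query the target for the parent's key.
        parent_pk = conn.execute(
            "SELECT pk FROM target WHERE name = ?", (parent_name,)
        ).fetchone()[0]
    conn.execute(
        "INSERT INTO target (name, parent_pk) VALUES (?, ?)", (name, parent_pk)
    )
    conn.commit()  # commit each row so the next lookup can see it
```

The per-row commit is what makes the previously loaded key visible to the next lookup when the lookup runs over a separate connection; batching commits is exactly what breaks this scenario.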
- Tue Sep 13, 2011 2:32 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: sequential file warning
- Replies: 4
- Views: 2712
- Tue Sep 13, 2011 12:32 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: records dropping in transformer!!
- Replies: 3
- Views: 2110
- Tue Sep 13, 2011 12:24 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: records dropping in transformer!!
- Replies: 3
- Views: 2110
- Tue Sep 13, 2011 11:49 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: records dropping in transformer!!
- Replies: 3
- Views: 2110
records dropping in transformer!!
There were around 5 jobs which had no dropped records, and all the fields have been given NVL in the source query. All the jobs were running fine. All of a sudden, records are getting dropped in the transformer stage which sits between two lookups, for all 5 jobs. Please suggest if I need to c...
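A common cause of this symptom is a NULL reaching a transformer derivation that has no reject link, in which case the row is silently dropped. A hedged Python sketch of that failure mode (the field names are invented, and Python's `TypeError` stands in for DataStage's NULL-in-derivation behaviour):

```python
# Rows where "amt" is NULL make the derivation fail; without a reject
# link the row would simply disappear from the output.
rows = [
    {"id": 1, "amt": "10"},
    {"id": 2, "amt": None},  # the NULL the NVL in the source query was masking
    {"id": 3, "amt": "30"},
]

output, rejects = [], []
for row in rows:
    try:
        row["amt2"] = int(row["amt"]) * 2  # fails when amt is NULL
        output.append(row)
    except TypeError:
        rejects.append(row)  # in DataStage: dropped, or sent down a reject link
```

If jobs that never dropped rows suddenly start dropping them, checking whether the source's NVL wrapping still covers every nullable column is a reasonable first step.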
- Tue Sep 13, 2011 8:28 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: sequential file warning
- Replies: 4
- Views: 2712
- Mon Sep 12, 2011 1:41 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: sequential file warning
- Replies: 4
- Views: 2712
sequential file warning
I have a warning from a Sequential File stage which is capturing rejects from a table:
Invalid character conversion found converting to ISO_8859-1:1987, substituting.
Please suggest if any changes are required.
Thanks
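The warning means the data contains characters that have no representation in ISO-8859-1 (Latin-1), so the stage substitutes a replacement character when writing the file. The same substitution can be demonstrated in a couple of lines of Python; the sample string is an invented example:

```python
# "☃" has no ISO-8859-1 code point, so it is substituted on encode,
# which is the same behaviour the DataStage warning is reporting.
text = "café résumé ☃"
encoded = text.encode("iso-8859-1", errors="replace")
```

The usual fixes are either to set the stage's character set (NLS map) to one that covers the incoming data, e.g. UTF-8, or to cleanse the offending characters upstream; the warning itself is harmless if the substitution is acceptable.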
- Thu Sep 08, 2011 11:50 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: sequential file warning
- Replies: 2
- Views: 1256