Search found 1099 matches

by kris007
Tue Aug 03, 2010 10:04 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Pick a particular duplicate record
Replies: 24
Views: 9215

dsusersaj wrote: I need to capture the duplicate rows too. So I think I should use the Sort stage with key change.
You didn't mention that in your original post. As you mentioned, you can do it using the Sort stage with Create Key Change Column set to True and then capture the required records downstream.
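For illustration only, here is a minimal Python sketch of what the key change flag gives you (this is not DataStage syntax, and the column names are made up): assuming the rows are already sorted on the key, the first row of each key group gets a flag of 1 and its duplicates get 0, so the duplicates can be captured downstream.

    # Minimal sketch of the key-change idea; rows must already be sorted on the key.
    rows = [
        {"Field1": "A", "Field2": "0001"},
        {"Field1": "A", "Field2": "0002"},
        {"Field1": "B", "Field2": "0001"},
    ]

    previous_key = None
    for row in rows:
        # keyChange = 1 on the first row of each key group, 0 on its duplicates
        row["keyChange"] = 1 if row["Field1"] != previous_key else 0
        previous_key = row["Field1"]

    duplicates = [r for r in rows if r["keyChange"] == 0]  # rows to capture downstream
    print(duplicates)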
by kris007
Tue Aug 03, 2010 9:44 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Pick a particular duplicate record
Replies: 24
Views: 9215

Well, that's exactly what the design I mentioned would do. In the Transformer Stage, create the Dummy column as mentioned earlier: If Field2 = '0001' Then 1 Else 0. In the Remove Duplicates Stage, specify Field1 as the Key on the Stage properties tab. But in the Input >> Partitioning tab after s...
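A rough Python sketch of the dummy-column pattern described above (plain Python for illustration, not DataStage syntax; the sample data is made up): flag the preferred value, sort so the flagged row comes first within each key group, then keep the first row per key, which is the record the Remove Duplicates Stage would retain.

    rows = [
        {"Field1": "A", "Field2": "0002"},
        {"Field1": "A", "Field2": "0001"},
        {"Field1": "B", "Field2": "0003"},
    ]

    # Dummy flag marks the record you prefer to keep within each Field1 group.
    for row in rows:
        row["Dummy"] = 1 if row["Field2"] == "0001" else 0

    rows.sort(key=lambda r: (r["Field1"], -r["Dummy"]))  # flagged row first per key

    kept, seen = [], set()
    for row in rows:
        if row["Field1"] not in seen:   # first row per Field1 wins
            kept.append(row)
            seen.add(row["Field1"])
    print(kept)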
by kris007
Tue Aug 03, 2010 9:07 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Pick a particular duplicate record
Replies: 24
Views: 9215

In my mind this is still a simple design, unless I am not understanding your requirement completely. I looked at your first post and I don't think the example you provided correctly reflects the issue you have. Can you be more specific with your example, let's say with about 4-5 sample records and how...
by kris007
Tue Aug 03, 2010 8:47 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Pick a particular duplicate record
Replies: 24
Views: 9215

Nope. You have to define your dummy column in a way that lets you identify the values you need to drop when they have duplicates. Something like If Field = '0001' Then 1 Else 2. In the Remove Duplicates, while you hash partition and sort the data on the keys, you need to just sort the data on the Dummy c...
by kris007
Tue Aug 03, 2010 8:24 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reg: Reference data size for lookup stage.
Replies: 7
Views: 3775

Can you post the exact error message? The LookUp Stage can definitely handle 4 million records.
by kris007
Tue Aug 03, 2010 7:17 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Unable to connect the database
Replies: 21
Views: 8839

Are you using the same user ID to connect to the database via DataStage and the command line? This is a simple permissions issue where you either provided a wrong user name or password (or both), or the account got locked.
by kris007
Mon Aug 02, 2010 9:41 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: surrogate key logic not working on 4 node
Replies: 9
Views: 4539

What you are seeing is the data from the first partition/node in your View Data result set. Just for the sake of it, after the Transformer stage put a Sort stage, sort your data on the key, and then try viewing the data. That should make you feel better :)
by kris007
Mon Aug 02, 2010 3:03 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Pick a particular duplicate record
Replies: 24
Views: 9215

Source --> Transformer --> Remove Duplicates --> Target. Set the constraint in the Transformer to Field2 <> "0001" and use Remove Duplicates as mentioned above. If "0001" is not always a constant value, you can always create a dummy column, something like a flag you can set based upon the in...
by kris007
Mon Aug 02, 2010 1:33 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Sparse Lookup and Normal Lookup
Replies: 3
Views: 2285

The Parallel Job Developer's Guide is a good place too. If you still have trouble understanding anything, we are here to help you. :D
by kris007
Sun Aug 01, 2010 2:24 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Inserting the data into db2 using db2 connector stage
Replies: 5
Views: 5473

Well...the error message says it all. You are trying to insert NULL values into a NOT NULLABLE field on the table. You either have to check your data or assign default values when the value is NULL.
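A small Python sketch of the second option, substituting a default when the source value is NULL before the insert (the column names and default values here are hypothetical, chosen just for illustration):

    # Hypothetical defaults for NOT NULL columns; adjust to the real table.
    DEFAULTS = {"customer_name": "UNKNOWN", "order_qty": 0}

    def apply_defaults(record):
        # Replace None (NULL) values with the column's default before inserting.
        return {col: DEFAULTS.get(col) if val is None else val
                for col, val in record.items()}

    print(apply_defaults({"customer_name": None, "order_qty": 5}))
    # {'customer_name': 'UNKNOWN', 'order_qty': 5}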
by kris007
Sat Jul 31, 2010 5:12 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: how to solve this
Replies: 2
Views: 1660

You can use an Aggregator to group by empno, deptno and get the max(sal). Or you can use a Remove Duplicates Stage with empno, deptno as the keys, sort the data on sal, and retain the value you want.
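As a plain-Python illustration of the Aggregator option (group on empno, deptno and keep the highest sal; the sample data is made up):

    rows = [
        {"empno": 1, "deptno": 10, "sal": 3000},
        {"empno": 1, "deptno": 10, "sal": 4500},
        {"empno": 2, "deptno": 20, "sal": 2500},
    ]

    # Keep the record with the maximum sal for each (empno, deptno) group.
    best = {}
    for row in rows:
        key = (row["empno"], row["deptno"])
        if key not in best or row["sal"] > best[key]["sal"]:
            best[key] = row

    print(list(best.values()))  # one record per (empno, deptno) with max(sal)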
by kris007
Sat Jul 31, 2010 5:07 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Inserting the data into db2 using db2 connector stage
Replies: 5
Views: 5473

Can you post the complete error message from the log, including the SQL codes? Looks like a permissions issue to me. If you can, please post the generated SQL as well.
by kris007
Fri Jul 30, 2010 4:37 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Inserting the data into db2 using db2 connector stage
Replies: 5
Views: 5473

Re: Inserting the data into db2 using db2 connector stage

You will have to give us more details on what you are trying to do in your DataStage job and what is not working, what stages you are using, and the database you are connecting to. What are the options you are using within the database stage? Upsert or user-defined update SQL, etc.?
by kris007
Fri Jul 30, 2010 2:21 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to run the same sequencer over and over each day
Replies: 7
Views: 2004

You need to create a Job Sequence, let's say A, with a Start Loop, Job Activity and End Loop. The Counter in the Start Loop should be set to how many times you want the job to run. Now, you need to create another Job Sequence (Master/Parent Sequence) with Job Sequence A in it and schedule the Master Sequen...
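The Start Loop / End Loop counters are configured on the Job Sequence canvas, but the control flow amounts to something like this Python sketch (run_job and the job name are stand-ins, not a real DataStage API):

    # Conceptual sketch of the loop sequence only.
    def run_job(name):
        print("Running job", name)

    RUN_COUNT = 3                      # what the Start Loop counter would be set to
    for iteration in range(1, RUN_COUNT + 1):
        run_job("JobA")                # the Job Activity inside the loop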