Search found 357 matches

by richdhan
Tue Jun 14, 2005 11:52 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Insert/Update Job Design: Comments?
Replies: 10
Views: 5813

Hi Carlson

Yes, I was referring to the Change Capture stage.

The Change Capture stage will work perfectly for your situation. Define the fields to compare in the Change Values section of the CDC stage, and the additional change_code column it generates will flag each update.
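For example, downstream of the Change Capture stage you can route rows with output-link constraints like these (the link name is a placeholder; the code values are the stage defaults of copy = 0, insert = 1, delete = 2 and edit = 3 unless you have overridden them in the stage):

    Insert link constraint:  lnk_cdc.change_code = 1
    Update link constraint:  lnk_cdc.change_code = 3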

HTH
Rich
by richdhan
Tue Jun 14, 2005 11:43 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Is it necessary to RESET the job after Run
Replies: 6
Views: 6290

Hi Kim, It is a good script but I have some clarifications.

################################################################################
# Runnable Job Status (do nothing)
#  1 "Finished"
#  2 "Finished (see log)"
#  9 "Has been reset"
# 11 "Validated OK"
# 1...
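For anyone adapting the idea, a minimal shell sketch of the check-status-then-reset-then-run pattern the script implements might look like this (the project/job variables and the way the numeric code is parsed out of dsjob -jobinfo are placeholders; verify the status codes against your release):

    #!/bin/ksh
    # Pull the numeric status code out of the dsjob -jobinfo output
    STATUS=`dsjob -jobinfo $PROJECT $JOB | grep "Job Status" | sed 's/.*(\([0-9]*\)).*/\1/'`

    # Reset only the states that need it (commonly 3 "Aborted" and 96 "Crashed")
    case $STATUS in
      3|96) dsjob -run -mode RESET -wait $PROJECT $JOB ;;
    esac

    # Then run the job normally
    dsjob -run -mode NORMAL -wait $PROJECT $JOB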
by richdhan
Tue Jun 14, 2005 11:29 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: problem with handling nulls with timestamp field
Replies: 4
Views: 1763

Hi,

Do a search on "handle_null" in this forum. A lot of examples have been discussed before.

HTH
Rich
by richdhan
Tue Jun 14, 2005 11:18 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: DB Number Columns (no scale/precision) give warnings
Replies: 2
Views: 1106

Hi Roy,

Are you using Oracle as the database? With Oracle, columns defined as Number (with no scale or precision) are imported by default as Decimal(38,10). We use the DecimalToDecimal function available in the Transformer to reduce the size as well as the precision.
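For example, with the output column defined as Decimal(10,2), a Transformer derivation like this (link and column names are placeholders) shrinks the imported Decimal(38,10) down to the output definition, with the optional rounding argument controlling how the excess scale is handled:

    DecimalToDecimal(lnk_in.AMOUNT, "trunc_zero")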

HTH
Rich
by richdhan
Tue Jun 14, 2005 11:07 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Running PX jobs for a limited number of rows
Replies: 7
Views: 5314

Hi Sathish,

All these options are enabled for a server job but not for a parallel job. What do you mean by a limited number of rows? Do you want to run only the first 50 rows, or do you want to limit the rows based on a condition? In either case you have to use a Transformer constraint.
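For the first-50-rows case, a constraint like this on the Transformer output link works (note that in a parallel Transformer @INROWNUM counts rows per partition, so either run the stage in sequential mode or expect roughly 50 rows per node):

    @INROWNUM <= 50

For the conditional case, put the condition itself in the constraint instead.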

HTH
Rich
by richdhan
Wed Jun 08, 2005 2:00 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Column Generator : Column Method = Schema File
Replies: 5
Views: 1989

Hi Memrinal, Column Method = Schema File is used by the Sequential File stage/File Set stage. It is basically used when you don't want to give the metadata directly in the stage but would rather use a schema file to supply the metadata information. A sample schema file looks like this schema record ( column...
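For illustration, a complete schema file for a Sequential File stage might look like this (the format properties, column names and types are only placeholders for your own layout):

    record
    {final_delim=end, delim=',', quote=double}
    (
      emp_id: int32;
      emp_name: string[max=50];
      hire_date: date;
      salary: nullable decimal[10,2];
    )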
by richdhan
Wed Jun 08, 2005 1:50 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Getting Insert and Update rows count
Replies: 4
Views: 2452

Hi Madhav,

Use the CDC stage. It gives an additional column, change_code, which identifies each row as an insert, update, delete or exact copy. Pass the data from the CDC stage to a Transformer and, using stage variables, you can find the count of inserts, updates and the total count.
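For example, stage variables along these lines (initial value 0; link and variable names are placeholders, and the change_code values are the stage defaults) accumulate the counts as rows flow through the Transformer:

    svInsertCount:  If lnk_cdc.change_code = 1 Then svInsertCount + 1 Else svInsertCount
    svUpdateCount:  If lnk_cdc.change_code = 3 Then svUpdateCount + 1 Else svUpdateCount
    svTotalCount:   svTotalCount + 1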

HTH
Rich
by richdhan
Wed Jun 08, 2005 1:46 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Insert/Update Job Design: Comments?
Replies: 10
Views: 5813

Hi lshort, Look into the CDC stage. The documentation in parjdev.pdf gives additional information on the CDC stage. It is used to identify new inserts, updates and deletes. You can finish it in one job. The CDC stage requires two inputs. Use the flat file as one input and the DBMS data as the other...
by richdhan
Wed Jun 08, 2005 1:36 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Need help with a sparse lookup
Replies: 4
Views: 2412

Hi Vincent, The sparse lookup is different from the normal lookup. A sparse lookup sends an individual SQL statement for every incoming row. It is useful when you want to get the next sequence number from an Oracle or DB2 sequence. Since it sends a SQL statement for every incoming row it should not...
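For illustration, the user-defined SQL behind such a sparse lookup can be as small as this (the sequence name is a placeholder); because the lookup is sparse, Oracle is hit once per incoming row, which is exactly why it suits the next-sequence-number case and little else:

    -- one NEXTVAL fetched per incoming row
    SELECT emp_seq.NEXTVAL FROM dual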
by richdhan
Mon May 16, 2005 11:13 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reading rows using schema file
Replies: 5
Views: 3874

Hi NewPXUser,

Did you try the solution I had provided?

Rich
by richdhan
Thu May 12, 2005 8:12 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: stage variable usage
Replies: 4
Views: 2874

Hi Harithay,

If you know stage variables can do it but don't know how to write them, do a search on stage variables in the Server forum. A lot of examples have been discussed there.

Rich
by richdhan
Thu May 12, 2005 8:09 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reading rows using schema file
Replies: 5
Views: 3874

Hi NewPXUser, If you want to use a schema file for a sequential file, then create a dummy job which loads a fileset with the following column configuration. Open the fileset and copy the schema structure from it. Use that schema structure for the sequential file. Hope this resolves all your issues. Keep us...
by richdhan
Fri May 06, 2005 2:28 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Modify Stage
Replies: 9
Views: 5510

Hi Bird,

This has been discussed before. Do a search on handle_null. The Modify stage uses Orchestrate functions; use handle_null in the Modify stage specification.
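For example, a Modify stage specification along these lines (the column name and replacement value are placeholders) turns a null into an in-band value before the data moves on:

    ORDER_TS = handle_null(ORDER_TS, "1900-01-01 00:00:00")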

Rich
by richdhan
Sat Apr 30, 2005 10:45 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: sort stage
Replies: 2
Views: 1783

Hi, From your post I believe you are using a Sort stage followed by a Join stage. I don't think this is necessary. The sorting can be defined in the Join stage itself. In the Join stage are you using hash partitioning? Within the hash partitioning, have you set the Perform Sort option? If you are using jo...
by richdhan
Mon Apr 25, 2005 7:31 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Accessing NON-partitioned Oracle table
Replies: 1
Views: 1035

Hi,

Look into this post.

Yes, regardless of what type of Oracle table is accessed, the SELECT privilege on those system tables should be granted in order for a parallel job using the Oracle Enterprise stage to execute successfully.
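As a sketch, the grants end up looking like this when run by a DBA (the grantee is a placeholder, and the full list of dictionary views comes from the Oracle Enterprise stage documentation for your release; DBA_EXTENTS and DBA_DATA_FILES are typical entries):

    GRANT SELECT ON dba_extents TO etl_user;
    GRANT SELECT ON dba_data_files TO etl_user;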

HTH
Rich