Search found 49 matches
- Mon May 01, 2006 10:47 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Sequencer not in MVS
- Replies: 7
- Views: 2017
- Sun Apr 30, 2006 5:47 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: How to capture the rows rejected by DB2 API Stage.
- Replies: 12
- Views: 4022
surojeet, the DB2 API stage is a sequential stage, and depending on the network bandwidth you have between the mainframe and the AIX server, it will slow things down further. For inserting large volumes of data, it is faster to FTP the sequential files to the mainframe and load them through a DB2 load utility, which slams ...
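The FTP-then-load approach suggested above can be sketched in Python. This is only an illustration: the host, credentials, dataset name, and table name are hypothetical placeholders, and the LOAD string is a generic DB2-style sketch rather than the exact utility syntax for any particular DB2 release.

```python
from ftplib import FTP

def build_load_command(dataset, table):
    """Build a generic DB2-style LOAD command string.
    A sketch only -- real DB2 load utility JCL/syntax varies by release."""
    return f"LOAD FROM {dataset} OF DEL REPLACE INTO {table}"

def ship_file(host, user, password, local_path, dataset):
    """FTP a sequential extract file to the mainframe in text mode."""
    with FTP(host) as ftp:  # hypothetical host
        ftp.login(user, password)
        with open(local_path, "rb") as fh:
            ftp.storlines(f"STOR {dataset}", fh)

if __name__ == "__main__":
    # ship_file("mvs.example.com", "user", "pw", "extract.dat", "'HLQ.EXTRACT.DAT'")
    print(build_load_command("'HLQ.EXTRACT.DAT'", "SCHEMA.TARGET_TBL"))
```

The point of the split is that the bulk data crosses the network once as a flat file, and the load itself runs locally on the mainframe instead of row-by-row over the API stage's connection.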
- Sun Apr 23, 2006 6:15 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Cobol COMP field Issue in DS
- Replies: 11
- Views: 4239
- Sun Apr 23, 2006 8:19 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Cobol COMP field Issue in DS
- Replies: 11
- Views: 4239
- Fri Apr 21, 2006 2:19 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: How to capture the rows rejected by DB2 API Stage.
- Replies: 12
- Views: 4022
- Wed Apr 19, 2006 7:13 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: About UDB API stage
- Replies: 21
- Views: 6281
- Tue Apr 11, 2006 5:54 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: HAWK requirements
- Replies: 6
- Views: 2090
- Mon Apr 10, 2006 9:13 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: HAWK requirements
- Replies: 6
- Views: 2090
HAWK requirements
I couldn't find the right forum for HAWK-related discussions, so I am posting it here! We are getting ready for the HAWK BETA on AIX, which is supposed to be out by mid-May '06. Does anybody know the ideal machine requirements for the AIX box to host the HAWK BETA? Here is what I am lo...
- Sat Apr 08, 2006 9:58 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Hash file look up not working
- Replies: 10
- Views: 4334
- Sat Apr 08, 2006 12:24 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Hash file look up not working
- Replies: 10
- Views: 4334
Why bother doing a lookup to omit the first and last rows? You can put the following constraint in the transformer:
If the constraint evaluates to TRUE, you should get all the rows except the first and last.
Code:
@INROWNUM > 1 and @INROWNUM <> @OUTROWNUM
- Sat Mar 11, 2006 11:39 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: SQL server connection problem
- Replies: 5
- Views: 2078
- Sat Mar 11, 2006 10:28 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Problem in using StoredProcedure
- Replies: 4
- Views: 1859
- Thu Mar 09, 2006 9:39 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: converting 100s of rows into 100s of columns
- Replies: 6
- Views: 1839
- Thu Mar 09, 2006 7:22 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: converting 100s of rows into 100s of columns
- Replies: 6
- Views: 1839
Thanks for the response! The solution you provided ensures I get all the data for a given customer on a single row, but how do I make sure I map the correct attribute to its corresponding value? Since each attribute is a column on my output, if the incoming data does not have a row for ...
- Wed Mar 08, 2006 9:35 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: converting 100s of rows into 100s of columns
- Replies: 6
- Views: 1839
converting 100s of rows into 100s of columns
Hi, I know a lot of pivot (vertical and horizontal) material has been covered on DSX, but I somehow could not find an optimal solution for what I am trying to accomplish. We have a situation where we have several hundred rows for a given customer, and all these rows need to be transformed into one uni...
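The attribute-to-column mapping concern raised in this thread can be sketched in Python. Assuming the input arrives as (customer, attribute, value) triples (the names and data below are hypothetical), the key idea is to place each value in its output column by attribute NAME rather than by row arrival order, so a customer missing an attribute simply gets a default in that column instead of shifting the others.

```python
from collections import defaultdict

# Hypothetical input: (customer_id, attribute_name, value) rows.
rows = [
    ("C1", "city", "Austin"),
    ("C1", "phone", "555-0100"),
    ("C2", "city", "Boston"),
]

# Fixed output column list, so every customer row lines up even
# when some attribute has no input row.
columns = ["city", "phone", "fax"]

def pivot(rows, columns, missing=""):
    # Group incoming rows by customer, keyed by attribute name.
    by_customer = defaultdict(dict)
    for cust, attr, value in rows:
        by_customer[cust][attr] = value
    # Emit one output row per customer, looking each column up by NAME.
    return {cust: [attrs.get(col, missing) for col in columns]
            for cust, attrs in by_customer.items()}

print(pivot(rows, columns))
# → {'C1': ['Austin', '555-0100', ''], 'C2': ['Boston', '', '']}
```

In a DataStage job the same effect would come from keying the pivot on the attribute column rather than on row position, but the above is only a language-neutral sketch of the logic, not the stage configuration itself.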