Search found 262 matches
- Mon Nov 13, 2006 8:54 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Unwanted Folders
- Replies: 3
- Views: 1213
Unwanted Folders
Hi, The 'Ascential\Datastage\Projects\TEST' folder has grown to a very large size. I can see many such folders (approx count of each type is 3000): DS_TEMPxxxx, RT_BPxxxx, RT_BPxxxx.O, RT_CONFIGxxxx, RT_LOGxxxx, RT_STATUSxxxx. Please tell me which one is for what and whether I can remove so...
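Before deleting anything, it can help to confirm how many folders of each run-time type actually exist. A minimal Python sketch of that count (the project path shown is hypothetical; point it at your own installation):

```python
import re
from collections import Counter
from pathlib import Path

# Run-time folder prefixes named above. RT_BPxxxx.O also counts under
# the RT_BP prefix, since the match only needs to cover the name's start.
PATTERN = re.compile(r"(DS_TEMP|RT_BP|RT_CONFIG|RT_LOG|RT_STATUS)\d+")

def count_runtime_folders(project: Path) -> Counter:
    """Count DataStage run-time folders in a project directory, by prefix."""
    counts = Counter()
    for entry in project.iterdir():
        if entry.is_dir():
            m = PATTERN.match(entry.name)
            if m:
                counts[m.group(1)] += 1
    return counts

# Example (hypothetical path):
# count_runtime_folders(Path(r"C:\Ascential\DataStage\Projects\TEST"))
```

These folders generally back compiled jobs and their logs and status records, so counting them first is much safer than removing anything by hand.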
- Fri Nov 10, 2006 7:51 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Timestamp Scale = 6 and Allow Nulls = 'Y'
- Replies: 1
- Views: 984
Timestamp Scale = 6 and Allow Nulls = 'Y'
Hi,
I can't set Timestamp Scale = 6 and Allow Nulls = 'Y'
together for the Timestamp type of field.
I am using DB2 as the database.
You may refer to my 'Timestamp and DB2' post; it did not get
resolved. Please help.
Thanks
- Thu Nov 09, 2006 2:46 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Processing Large Volume of Data
- Replies: 13
- Views: 3564
Well, my case was just a general case. Lots of transformations, lookups, etc. I did all that and then captured the results in a flat file. Then my final job would pick up that file and load it into the UDB table using the load utility. It would be a great help if you could share the following info: 1. What w...
- Wed Nov 08, 2006 2:49 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Processing Large Volume of Data
- Replies: 13
- Views: 3564
- Wed Nov 08, 2006 2:36 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Processing Large Volume of Data
- Replies: 13
- Views: 3564
I highly doubt you can update data using the load utility. As the name suggests, it can just load; in other words, LOAD, INSERT, REPLACE etc., and not update. Updates will be handled separately as a logged activity (DML). Well, can you comment on the performance level of DB2 Bulk Load? What is the rate ...
- Wed Nov 08, 2006 2:17 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Processing Large Volume of Data
- Replies: 13
- Views: 3564
Are you getting that amount of data on a daily basis? If yes, then it's going to be updates too, right? If this is for the historical load then I would say yes, go for the DB2 load stage. Explore it, and search here for how to set its properties correctly. I am going to get data on a daily basis, but it won't be up...
- Wed Nov 08, 2006 2:08 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Processing Large Volume of Data
- Replies: 13
- Views: 3564
Processing Large Volume of Data
Hi, Suppose I have to process some 200 million rows which have to undergo some transformations and validations and then land in a database table. What I am doing currently is doing all validations and lookups and directly inserting (insert without clearing) into a table. What I want to...
- Tue Nov 07, 2006 3:35 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: DB2/UDB Bulk Load
- Replies: 1
- Views: 563
DB2/UDB Bulk Load
Hi, In what conditions do we go for DB2 Bulk Load? Can anyone help me with setting up the properties to use the DB2 Bulk Load stage? I am not able to understand what path has to be given for: 1. Directory for Data and Command (here I just gave C:\) 2. Local Message File name (here I don't know what to set...
- Mon Nov 06, 2006 5:09 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Accessing DB2 using ODBC
- Replies: 2
- Views: 708
Accessing DB2 using ODBC
Hi, I am trying to access DB2 using the ODBC stage, but I am getting the following error for all the columns. ODBC_88.DSLink86: DSD.BCIOpenR results of SQLColAttributes(BUSINESS_UNIT) gave MetaData mismatch COLUMN.TYPE Expected = Char Actual = Unknown Please tell me a workaround. I want to use ODBC ...
- Mon Nov 06, 2006 4:10 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Transfer a Batch of Data over a link
- Replies: 6
- Views: 1176
:? 'Complete batch'? I'm guessing from your 'usually it just sends the last row of a batch of rows' comment that you want a lookup to return a 'multi-row result set'. If that's the case, there are plenty of conversations here on the subject that a search would turn up. And one of the first things you...
- Sun Nov 05, 2006 6:00 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Transfer a Batch of Data over a link
- Replies: 6
- Views: 1176
Transfer a Batch of Data over a link
Hi,
I want to transfer a complete batch of data over a link, instead of row by row. (The batch of data might be the result of a query or, say, a lookup.)
Usually what happens is that it just sends the last row of a batch of rows.
Please suggest something.
Thanks.
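The single-row behaviour described in this post can be sketched outside DataStage: a plain key-to-value lookup keeps only the last row written per key, while a multi-row result set keeps every row. A minimal Python sketch with made-up sample rows:

```python
from collections import defaultdict

# Sample "batch" of rows keyed by account id (made-up data).
rows = [
    ("A1", 100), ("A1", 250), ("A1", 75),
    ("B2", 40),
]

# A plain dict lookup keeps only the last row per key --
# the "just sends the last row" behaviour described above.
single = dict(rows)

# A multi-row result set keeps the whole batch for each key instead.
multi = defaultdict(list)
for key, amount in rows:
    multi[key].append(amount)

print(single["A1"])  # 75 -- only the last row survives
print(multi["A1"])   # [100, 250, 75] -- the whole batch
```

The lookup itself has to be built as a multi-row structure; no setting on the link turns a single-row lookup into a batch.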
- Sat Nov 04, 2006 6:09 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Comma to Decimal Point Substitution
- Replies: 2
- Views: 1370
Comma to Decimal Point Substitution
Hi Friends,
In my database, the decimal values use commas (,) instead of decimal points (.).
The values, when read into DataStage, are hence treated as String (e.g. -15,89 which actually is -15.89).
How do I convert the comma into a decimal point?
Thanks.
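The substitution itself is a one-character replacement; a minimal Python sketch, using the sample value from the post:

```python
def comma_to_decimal(value: str) -> float:
    """Convert a decimal-comma string such as '-15,89' to a float."""
    return float(value.replace(",", "."))

print(comma_to_decimal("-15,89"))  # -15.89
```

Inside a Server job, the same idea can be expressed in a transformer derivation with the Convert function, e.g. `Convert(",", ".", InLink.col)` (column name hypothetical).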
- Sat Nov 04, 2006 4:38 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Max of the Data
- Replies: 11
- Views: 2340
Perform a join with the keys (F1,F2). Pass the output to an Aggregator. Group by (F1,F2). For the derivation of F3 and F4, use 'Last' (assuming they are sorted; if not, use MAX()). gateleys Hi, Please check the example in the post above. Also explain to me, since the result of the query from th...
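The Aggregator logic described here (group on the keys, take MAX of the other fields) can be sketched in Python with made-up sample rows:

```python
from itertools import groupby
from operator import itemgetter

# Sample joined rows as (F1, F2, F3, F4) tuples (made-up data).
rows = [
    ("a", 1, 10, 5),
    ("a", 1, 20, 3),
    ("b", 2, 7, 9),
]

# Group by the keys (F1, F2) and take MAX(F3), MAX(F4) per group,
# mirroring the Aggregator derivation. groupby needs sorted input.
result = []
for key, grp in groupby(sorted(rows), key=itemgetter(0, 1)):
    grp = list(grp)
    result.append(key + (max(r[2] for r in grp), max(r[3] for r in grp)))

print(result)  # [('a', 1, 20, 5), ('b', 2, 7, 9)]
```

Note that 'Last' and MAX() only agree per column when the rows are sorted on that column; with unsorted input, MAX() is the safe choice.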
- Fri Nov 03, 2006 12:09 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Max of the Data
- Replies: 11
- Views: 2340
Can you explain the above requirement in a small, simple mapping-view kind of thing, like -- Hashed File | | A ---- Trans ------ Because I am now confused, and would like to know why we cannot perform the MAX thing. We cannot do the MAX thing because of 'Row By Row' processing, because of which the DB St...
- Fri Nov 03, 2006 11:26 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Max of the Data
- Replies: 11
- Views: 2340
Can you not do this before putting the data in the hashed file == Select F1#, F2#, MAX(F3#), MAX(F4#) from A group by F1#, F2# That way your hashed file will have only one record corresponding to the F1 and F2 fields, and then you can do the lookup without any complex logic. Hope this makes sense. No No...