Search found 464 matches

by WoMaWil
Fri Jul 11, 2003 3:21 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Reading rows from and adding rows to a hashfile in
Replies: 7
Views: 3011

Hi Victor,

why do you want to use the Link Collector? A hash file as the target is normally enough for this task. What you are doing is quite standard; I don't know of any project that doesn't have a feature like the one you describe.



Wolfgang Huerter
=====================
Cologne, Germany
by WoMaWil
Thu Jul 10, 2003 8:43 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: More about Phantoms
Replies: 9
Views: 1770

What is your problem, Emma? Having such phantoms while a job is running is quite normal, so don't mind having them. If the job doesn't work or hash files are not filled, then you have a problem which you have to analyse. In the &PH& directory, as Kim and Ray have mentioned, you may find hints fo...
by WoMaWil
Thu Jul 10, 2003 8:37 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Plug-in Development
Replies: 3
Views: 1516

This guide is a separate document which you have to request separately from Ascential. If you want to start one of your existing tools from DataStage, you don't have to write such a plug-in; there are many easier ways. Writing a plug-in is not easy. If you want some info about VB and DataStage, p...
by WoMaWil
Tue Jul 08, 2003 8:12 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: How to Delete a set of Records from the Hash File
Replies: 5
Views: 1388

Hi,

If you use a UV stage instead of a hash stage, you are even able to delete records with a job.


Wolfgang Huerter
=====================
Cologne, Germany
by WoMaWil
Tue Jul 08, 2003 8:06 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: ODBC and native connection to same data source?
Replies: 7
Views: 2383

Hi Steve, it is true, we have had (and still have) the same problems on 5.2; just last week a job that had been running for half a year worked after a change we made, without us understanding why. So try putting sequential files in between, or change the hash stage to a UV stage, which is about the same, and for sure you will be surprised yo...
by WoMaWil
Tue Jul 01, 2003 9:01 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Transform multiple columns into multiple records
Replies: 2
Views: 978

Hi Gabriel, it is easy to manage; it depends on whether n is fixed or variable. If it is fixed, you take a link from every column of table1 to your table2, so each link writes a row. If it is variable, you have to use the multivalue feature of a hash file. Much success to you. Wolfgang Huerter ===============...
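For the fixed-n case, here is a minimal sketch in Python rather than DataStage (table1, the column names and the columns_to_rows helper are only illustrative assumptions) of the one-row-per-column idea that each Transformer link implements:

    def columns_to_rows(row, value_columns):
        # One output record per value column -- the same effect as giving
        # each column its own output link in the Transformer.
        for col in value_columns:
            yield {"key": row["key"], "source_column": col, "value": row[col]}

    table1 = [{"key": 1, "col_a": 10, "col_b": 20, "col_c": 30}]
    table2 = [rec
              for row in table1
              for rec in columns_to_rows(row, ["col_a", "col_b", "col_c"])]
    # table2 now holds three records for the single input row, one per column.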
by WoMaWil
Fri Jun 27, 2003 3:02 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: status job
Replies: 2
Views: 1169

Hi Angel, that's an interaction between Windows (client) and the server. Maybe the last client process crashed and the server believes you are still logged on. So you have two options: wait until the end of the timeout (mostly an hour, but in theory eternity is also possible if timeout=0; look in the administra...
by WoMaWil
Wed Jun 25, 2003 5:11 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Using Environment Variables
Replies: 7
Views: 2282

When starting the dsengine as root or dsadm (in version 5.x and greater), don't forget that all your environment settings are also inherited (see the sketch below).



Wolfgang Huerter
=====================
Cologne, Germany
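A minimal generic sketch of this inheritance, in Python rather than DataStage (the variable name MY_SETTING and the printenv call are only illustrative assumptions): any process you start picks up the environment of the session that starts it, exactly as dsengine picks up the environment of the root/dsadm session.

    import os
    import subprocess

    # The parent sets a variable in its own environment ...
    os.environ["MY_SETTING"] = "set-by-parent"   # hypothetical variable name

    # ... and a child process it starts inherits that environment automatically
    # (assumes a Unix-like system where printenv exists).
    child = subprocess.run(["printenv", "MY_SETTING"],
                           capture_output=True, text=True)
    print(child.stdout.strip())   # prints: set-by-parent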
by WoMaWil
Tue Jun 17, 2003 4:56 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Manual Create index in Universe Stage
Replies: 5
Views: 1265

Hi cheery,

A UniVerse table and a hash file are the same thing. All key fields are automatically indexed. Read the manual concerning hash files and UniVerse; there are many more things to keep in mind about them.

Wolfgang Huerter
=====================
Cologne, Germany
by WoMaWil
Mon Jun 16, 2003 5:18 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: ODBC for Sybase IQ
Replies: 6
Views: 5848

Shalom Dina,

Did you restart DataStage after changing dsenv?


Wolfgang Huerter
=====================
Cologne, Germany
by WoMaWil
Fri Jun 13, 2003 6:49 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Internal data error
Replies: 3
Views: 2188

Ciao Mario, I don't have an ad-hoc answer for you. What I would do is: (1) try a shorter file name; (2) check how many links write to the file, and if there is more than one, add a dummy stage at the beginning of the job to re-create and/or clear the file. What about your hard disk, is there enough space left? Wolfgang H...
by WoMaWil
Fri Jun 13, 2003 6:40 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: ODBC Stage Performance
Replies: 1
Views: 958

Hi Simon, performance is a science in itself. You can never set up general rules for optimal or better performance. Even those who say native is a factor of 2 to 3 better than ODBC are wrong; I have seen situations where ODBC was about 20% quicker than native. Your only chance is trial and error...
by WoMaWil
Wed Jun 11, 2003 8:15 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Sequential file issue
Replies: 5
Views: 1978

Hi Reg, concerning (1): fixed width is not *.csv, so if you set "first line = column names" in Format it will be ignored, so this shouldn't be a problem. Concerning (2): my experience is, if you can switch your users away from Excel to Access or anything else, do it. This will save you a lot of painful c...
by WoMaWil
Fri Jun 06, 2003 6:58 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Managing Category folders
Replies: 3
Views: 911

Hi PJ,

Try toggling View > "Extended Job View" on and off. Some functions are not available in one of the views and some not in the other.

Wolfgang Huerter
=====================
Cologne, Germany
by WoMaWil
Fri Jun 06, 2003 6:21 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: how to forbid the warning log in director
Replies: 10
Views: 3912

Hi XiangMing, Could you divide perhaps your job in more runs, so you could tell datastage to clear the logfile after each run. Lets say, you have 1000 row to load, you may load 10 times 100 rows instead and your Logfile will have 10% of the size. This could be done by batch for example. For the redi...