Search found 95 matches

by ewartpm
Tue Mar 22, 2011 1:49 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: DataSet LookUp
Replies: 2
Views: 2407

DataSet LookUp

The partitioning on the lookup stage is set to auto. I do a lookup to the dataset where the key values, incoming and on the dataset, are varchar. This works when the varchar contains only numerics, e.g. '13300'. It does not work when the varchar contains non-numerics, e.g. 'VDB_120'. Has anyone el...
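For a keyed lookup, matching keys from the stream and the reference dataset must land on the same processing node, which means both inputs need the same key-based partitioning. A minimal Python sketch of the idea (the node count and hash function are illustrative, not DataStage's internals): a hash of the raw string handles any varchar key, whereas anything that treats the key as a number works only for purely numeric strings like '13300' and breaks on 'VDB_120'.

```python
# Illustration: key-based partitioning for a lookup. Hashing the raw
# string works for any varchar key; treating the key as a number only
# works when the string is purely numeric.
from zlib import crc32

NODES = 4  # illustrative degree of parallelism

def hash_part(key: str) -> int:
    # Applying the SAME function to both inputs guarantees matching
    # keys meet on the same node, whatever the key contains.
    return crc32(key.encode("utf-8")) % NODES

def modulus_part(key: str) -> int:
    # Only valid when the varchar holds digits.
    return int(key) % NODES

print("13300   ->", hash_part("13300"))
print("VDB_120 ->", hash_part("VDB_120"))
# modulus_part("VDB_120") raises ValueError: the key is not numeric.
```

The point is not the particular hash, only that both sides of the lookup go through the same key-aware function.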
by ewartpm
Thu Jul 05, 2007 6:54 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Waiting for File
Replies: 6
Views: 3080

Waiting for File

I have a job sequence that has two 'wait for file' stages. The triggers from the 'wait for file' stages are set to OK. :? For some odd reason, the job sequence does not issue the two wait for files immediately. Only one wait for file is issued. When that file arrives, the next wait for file is issue...
by ewartpm
Mon Oct 16, 2006 4:34 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Sybase IQ Bulk Loader
Replies: 5
Views: 3272

Thanks for the reply. However, the client does not want to use the 'after job' option.
by ewartpm
Mon Oct 16, 2006 3:30 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Sybase IQ Bulk Loader
Replies: 5
Views: 3272

Sybase IQ Bulk Loader

Hi Guys. I'm using the bulk loader because of high data volumes. In the bulk loader I have the option to code a delete statement. The delete removes any data already loaded for a period, after which the data being pushed is re-loaded. This has been done to cater for re-runnability. However, if no rows go do...
by ewartpm
Wed Aug 16, 2006 9:25 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Hash File Lookup
Replies: 10
Views: 4488

Thanks for the replies. Problem is I don't have 'Premium Support' so can't see the full answer. :oops:

My boss is getting a long, long justification...
by ewartpm
Tue Aug 15, 2006 9:48 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Hash File Lookup
Replies: 10
Views: 4488

Hi Guys. Thanks for the replies. No constraint is specified in the job. Out of 100,000 rows, I get 15 with hyphens which do not appear in the hash file. The link information says that the rows were written to the hash file. I thought maybe it was a character that DataStage displayed when using the 'view ...
by ewartpm
Tue Aug 15, 2006 5:16 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Hash File Lookup
Replies: 10
Views: 4488

Hash File Lookup

Hi Guys. I'm getting a strange occurrence when I build the hash file. The data in the source table contains a hyphen, i.e. '-', in this case in the account number. A key column on the hash file is the account number, varchar(15). Any account number with a hyphen in it is not written to the hash file. There a...
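A non-display character sitting next to the hyphen, or a look-alike character such as a non-breaking hyphen, would explain keys that vanish while looking identical on screen. A quick check outside DataStage, sketched in Python with made-up account numbers, is to dump the raw code points of the suspect keys:

```python
# Dump the exact code points of suspect keys: a 'view data' style
# display can render different characters identically.
def dump(key: str) -> list:
    return [hex(ord(c)) for c in key]

def is_plain_ascii(key: str) -> bool:
    # True only when every character is printable ASCII,
    # including the ordinary hyphen-minus U+002D.
    return all(" " <= c <= "~" for c in key)

plain = "ACC-0001"             # ordinary hyphen-minus
lookalike = "ACC\u20110001"    # non-breaking hyphen, renders much the same

print(plain, dump(plain), is_plain_ascii(plain))
print(lookalike, dump(lookalike), is_plain_ascii(lookalike))
```

Running the real source extract through a check like `is_plain_ascii` would confirm or rule out hidden characters before blaming the hash file itself.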
by ewartpm
Wed May 03, 2006 5:59 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Parallel Extender
Replies: 8
Views: 4336

We are currently using DS 7.5.1A server version and want to move to PX. Is it possible to simply export the server jobs (dsx) and import them into PX?
by ewartpm
Fri Jan 13, 2006 4:32 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Link Variables
Replies: 3
Views: 2136

Thanks Ken and Ray, much appreciated.
by ewartpm
Thu Jan 12, 2006 9:12 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Link Variables
Replies: 3
Views: 2136

Link Variables

Does anyone know where the documentation for the link variables is hidden away, e.g. link constants? I have searched through all the documentation and cannot find anything about link variables except the basics of how to use them. One would expect a PDF to be published.
:cry:
by ewartpm
Wed Dec 14, 2005 12:52 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Redirect data on a warning
Replies: 2
Views: 2245

Thanks for the reply. However, my OS is Windows, not Unix.

All I want to do is somehow identify those source files I receive whose content is UTF-8 encoded. I currently run this process manually and want to automate it.
by ewartpm
Tue Dec 13, 2005 7:31 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Redirect data on a warning
Replies: 2
Views: 2245

Redirect data on a warning

I receive several source files as input. Problem is I don't know which ones are UTF-8 encoded. What I want to do is the following: Folder---Transformer---ODBC, with NLS set to UTF8 in the Folder stage. The filename from the Folder stage is written to the ODBC. When I run the job, any source file that does n...
by ewartpm
Tue Aug 30, 2005 1:44 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: dsenv and odbc.ini file setup
Replies: 1
Views: 1552

dsenv and odbc.ini file setup

I need to connect to the following databases: Informix Dynamic Server 9.4 and DB2 UDB 8.1.6. We are using the DataDirect Wire Protocol 4.2 drivers. We want to use the bulk loader stages for these databases (DB2 Load and Informix Loader stages). Strange thing is we can connect using the ODBC stage, but c...
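One common reason the ODBC stage works while the loader stages fail is the engine's environment: the native DB2/Informix client libraries that the loader stages call are not on the library path that dsenv sets up. A sketch of the kind of dsenv additions involved, with every path below a placeholder for the actual install locations:

```shell
# Sketch of dsenv additions: point the engine at the .odbc.ini in use
# and add the native DB2/Informix client libraries to the search path.
# All paths are placeholders -- substitute the real install locations.
ODBCINI=/opt/Ascential/DataStage/DSEngine/.odbc.ini
export ODBCINI

LD_LIBRARY_PATH=${LD_LIBRARY_PATH:-}:/opt/odbc/lib:/opt/db2/lib:/opt/informix/lib
export LD_LIBRARY_PATH
```

On AIX the equivalent variable is LIBPATH; after editing dsenv, source it and restart the DataStage engine so running processes pick up the change.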
by ewartpm
Tue Aug 30, 2005 1:38 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: .odbc.ini and dsenv setup
Replies: 1
Views: 1699

.odbc.ini and dsenv setup

I need to connect to the following databases: Informix Dynamic Server 9.4 and DB2 UDB 8.1.6. We are using the DataDirect Wire Protocol 4.2 drivers. We want to use the bulk loader stages for these databases (DB2 Load and Informix Loader stages). Strange thing is we can connect using the ODBC stage, but c...
by ewartpm
Wed Aug 17, 2005 7:47 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: DataStage blamed again
Replies: 3
Views: 2424

DataStage blamed again

We run DataStage jobs daily. The database (Informix) is on a SAN and the hash files and sequential files are on the AIX server. The jobs are simple seq-->trfm-->seq in most cases with one or two hash file lookups. Hash files are large, most over 1,000,000 rows and at least 20 columns wide (not good ...