Search found 67 matches

by CharlesNagy
Mon Jul 07, 2008 4:08 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Using an Aggregator stage to calculate Year to date values
Replies: 3
Views: 2914

Using an Aggregator stage to calculate Year to date values

I have incoming data sorted by YearMonth (e.g. YYYYMM), then BusinessUnit, InvGroup & InvItemGroup, which is aggregated to give a monthly total in the following job structure: Oracle Read ---- Transformer ------ Aggregator ------- Transformer. I now need to populate a new field, YTD, with cumulative ...
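The cumulative logic the post asks about can be sketched outside DataStage. A minimal C++ illustration, assuming rows arrive sorted as described and that YTD resets when the year portion of YearMonth changes (the row shape and reset rule are assumptions, not the poster's actual job):

```cpp
#include <map>
#include <string>
#include <tuple>
#include <vector>

// Hypothetical row shape: YYYYMM period, the three grouping keys,
// and the monthly total produced by the Aggregator stage.
struct Row {
    std::string yearMonth;     // e.g. "200801"
    std::string businessUnit;
    std::string invGroup;
    std::string invItemGroup;
    double monthlyTotal;
    double ytd = 0.0;          // to be populated
};

// Rows must arrive sorted by YearMonth within each key group, as the
// post describes; keying on the year substring makes the running sum
// restart naturally at each new year.
void addYearToDate(std::vector<Row>& rows) {
    std::map<std::tuple<std::string, std::string, std::string, std::string>,
             double> running;
    for (auto& r : rows) {
        auto key = std::make_tuple(r.yearMonth.substr(0, 4),  // year part
                                   r.businessUnit, r.invGroup, r.invItemGroup);
        running[key] += r.monthlyTotal;   // accumulate this group's total
        r.ytd = running[key];             // cumulative value so far this year
    }
}
```

In a Server job the same effect is usually achieved with stage variables holding the previous key and running total; the map above plays that role.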
by CharlesNagy
Mon Jun 16, 2008 1:32 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: How to speed up multiple lookups in a Databasic routine
Replies: 11
Views: 2505

The public disk caching is a DataStage engine level setting that can be enabled per hashed file, and it is independent of which methods are used to read/write to the files, so it will work from BASIC code just as it does in jobs. I just took a quick look and it seems that the "dsdskche.pdf" d...
by CharlesNagy
Fri Jun 13, 2008 9:26 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: How to speed up multiple lookups in a Databasic routine
Replies: 11
Views: 2505

That's the document. I can't recall the exact numbers, but the maximum amount of memory useable for the cache is initially small but can be expanded. There are a couple of "gotchas" with caching and I've managed to get the DS instance so confused that not a single r/w operation worked unt...
by CharlesNagy
Fri Jun 13, 2008 9:10 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: How to speed up multiple lookups in a Databasic routine
Replies: 11
Views: 2505

Of course the DataStage hashed file caching works quite well - but I'm sure you know that and have reasons for not being able to use this. Depending upon how important this is to you, you can enable memory file caching for DataStage at the system level and use that; this would mean that your Hashed...
by CharlesNagy
Fri Jun 13, 2008 7:42 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: How to speed up multiple lookups in a Databasic routine
Replies: 11
Views: 2505

It probably is the reads. I've sometimes included my own cache in programs of this type by creating a Key and a Data dynamic variable for each file. The keys & data are appended as new values to these strings, and I check up on the Key (using LOCATE) before trying to do an actual read. In many ...
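The home-grown cache the post describes (parallel Key and Data lists, scanned with BASIC's LOCATE before attempting a real read) translates directly to other languages. A minimal C++ sketch of the same idea, with `readFile` as a stand-in for the actual hashed-file READ (the name is illustrative):

```cpp
#include <functional>
#include <string>
#include <vector>

// Per-file cache mirroring the post's Key/Data dynamic arrays: keys and
// data are appended in parallel, and the key list is scanned (like
// BASIC's LOCATE) before an actual file read is attempted.
struct FileCache {
    std::vector<std::string> keys;
    std::vector<std::string> data;

    // readFile stands in for the real hashed-file READ.
    std::string lookup(const std::string& key,
                       const std::function<std::string(const std::string&)>& readFile) {
        for (std::size_t i = 0; i < keys.size(); ++i)
            if (keys[i] == key) return data[i];   // cache hit: no file read
        std::string value = readFile(key);        // cache miss: do the real read
        keys.push_back(key);                      // remember for next time
        data.push_back(value);
        return value;
    }
};
```

As the post notes, this pays off when the same keys recur often; for very large key sets a hash map would beat the linear scan, but the LOCATE-style scan is the closest match to the BASIC technique described.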
by CharlesNagy
Fri Jun 13, 2008 3:38 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: How to speed up multiple lookups in a Databasic routine
Replies: 11
Views: 2505

How to speed up multiple lookups in a Databasic routine

I am trying to speed up a routine called from a transformer, which, based on the arguments fed to it, reads from various lookup hashed files to return the values required for each row. Each time the routine is called it may read from several hashed files in order to gather the information. Commons a...
by CharlesNagy
Mon Apr 28, 2008 2:56 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Can you specify DOS or UNIX format in Writeseq command?
Replies: 2
Views: 1078

Thanks Kim,

Will play with your suggestions and see which works best for us.

much appreciated..
kduke wrote: You can fix this when you FTP to Windows from Linux. You can also use the UNIX2DOS and DOS2UNIX commands.
by CharlesNagy
Thu Apr 24, 2008 7:31 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Can you specify DOS or UNIX format in Writeseq command?
Replies: 2
Views: 1078

Can you specify DOS or UNIX format in Writeseq command?

We have a server routine that writes out our metadata to sequential files on our server. We can then import them using Manager. This worked fine on Unix; however, since we changed to Linux, the import fails, and we now have to edit these files first, resave them in DOS format, and then we can import...
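The underlying issue is just which line terminator gets written. A hedged C++ sketch of writing a sequential file with an explicit DOS (CRLF) or UNIX (LF) terminator, independent of platform (the file name and `dosFormat` flag are illustrative, not part of WriteSeq itself):

```cpp
#include <fstream>
#include <string>
#include <vector>

// Writes each line with an explicit terminator, so the same routine can
// produce DOS (CRLF) or UNIX (LF) files on any platform. Opening the
// stream in binary mode stops the runtime doing its own translation.
bool writeLines(const std::string& path,
                const std::vector<std::string>& lines,
                bool dosFormat) {
    std::ofstream out(path, std::ios::binary);
    if (!out) return false;
    const char* eol = dosFormat ? "\r\n" : "\n";
    for (const auto& line : lines) out << line << eol;
    return true;
}
```

In DataStage BASIC the equivalent trick is appending `CHAR(13)` to each line before WriteSeq, so the LF that WriteSeq adds on Linux becomes a CRLF pair.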
by CharlesNagy
Thu Apr 03, 2008 10:44 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Writing to the Log file from a transformer in a Parallel Job
Replies: 4
Views: 1489

Function to write to the Log File

After closer inspection, cout actually works. I wrote the following function, which I call from a transformer, and it does indeed write to the log file: void writeLog(char *logMsg) { cout << logMsg << endl; return; } It appears in the log with an APT_CombinedOperatorController event...
by CharlesNagy
Thu Apr 03, 2008 7:46 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Writing to the Log file from a transformer in a Parallel Job
Replies: 4
Views: 1489

Thanks Ray, Will check it out... It will slow the job down immensely but it can be done. It's probably best to call your routine from a stage variable, so that you can discard the result. Write your routine using the C-language DataStage API; you probably need DSLogInfo() or DSLogWarn() somewhere wi...
by CharlesNagy
Thu Apr 03, 2008 4:17 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Writing to the Log file from a transformer in a Parallel Job
Replies: 4
Views: 1489

Writing to the Log file from a transformer in a Parallel Job

Hi, I am trying to write a message to the logfile from a parallel transformer if a certain condition is not met, e.g.: If condition then pass Data else log message. I presume I will have to write a C++ routine, but I don't know what command to issue to accomplish this. I have tried a simple cout, but ...
by CharlesNagy
Tue Feb 12, 2008 6:32 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Field Null Value Property not visible
Replies: 9
Views: 3058

Ok, then remove the error-causing default "null-field" attribute from the column definition. Thanks, but we tried that too and it didn't help. Actually, by dint of experimenting we found the answer.... If you try to edit the metadata in the Oracle stage, you will not see the null field op...
by CharlesNagy
Tue Feb 12, 2008 5:05 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Field Null Value Property not visible
Replies: 9
Views: 3058

You need to add a valid date as the null value, i.e. "2008-12-31 23:59:59", assuming your system default is yyyy-mm-dd Thanks, but we are doing this with the following code in the transformer: NullToValue(Read_Table.JOURNAL_LINE_DATE, '1900-01-01 00:00:00') Hence we are explicitly padding...
by CharlesNagy
Tue Feb 12, 2008 4:36 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Field Null Value Property not visible
Replies: 9
Views: 3058

Null Field Value not on Properties to add list

We are attempting to use an Oracle Enterprise stage to read some data. When processing a Timestamp field, we get the following error: Oracle_Enterprise_11: When checking operator: When validating export schema: At field "JOURNAL_LINE_DATE": "null_field" length (0) must match fiel...
by CharlesNagy
Fri Dec 14, 2007 8:54 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: CREATE.FILE usage for 64 bit hashed files
Replies: 30
Views: 8522

Yup, and Ray has laid it all out in the past, let me see if I can find it... ah: http://www.dsxchange.com/viewtopic.php?t=86639 Wow, "UVFIXFILE filename VLEVEL 0 TRACE 1", and if it doesn't work then it is 64 BIT; that's so obvious, why didn't I think of that? :wink: Thanks a bunch, I wo...