Search found 93 matches

by MarkB
Wed Apr 27, 2011 9:42 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: in operator in Transformer
Replies: 3
Views: 1681

Re: in operator in Transformer

Nagac wrote: I could not find IN operator in Transformer.
That's because there isn't one ... a simple search here would show this topic discussed many times over. You can use the Index function to get the results you need.
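The Index-based membership test the reply points to can be sketched in Python. This is an illustrative analog, not DataStage code: DataStage's Index(string, substring, occurrence) returns the 1-based position of the nth occurrence, or 0 if it isn't found, so `Index(...) > 0` stands in for IN. The delimiter-wrapped list is an assumption to guard against partial matches.

```python
def index_func(haystack, needle, occurrence=1):
    """Analog of DataStage's Index(): 1-based position of the
    nth occurrence of needle in haystack, or 0 if not found."""
    pos = -1
    for _ in range(occurrence):
        pos = haystack.find(needle, pos + 1)
        if pos == -1:
            return 0
    return pos + 1

# "IN" test: wrap both list and value in delimiters so 'B' can't
# accidentally match inside a longer code like 'AB'.
codes = ",A,B,C,"
print(index_func(codes, ",B,") > 0)   # True: 'B' is in the list
```

In a real Transformer derivation the same shape would be an expression like `Index(",A,B,C,", "," : link.Code : ",", 1) > 0`.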
by MarkB
Mon Apr 25, 2011 8:21 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Data file having header and trailer. Want to read the file wi
Replies: 7
Views: 6629

In a Before Job routine, do an ExecSH and head -1 the source file to one file and tail -1 the source file to another. Let your job run as-is; the header and trailer rows will drop off as rejects because they don't meet the file specifications. In an After Job routine, do an ExecSH and cat the header fi...
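The before-job split described above can be sketched as follows; the filenames are hypothetical, and in the actual ExecSH call this would simply be head -1 and tail -1 redirected to the two files.

```python
def split_header_trailer(src, header_out, trailer_out):
    """Copy the first record of src to header_out (the head -1 step)
    and the last record to trailer_out (the tail -1 step),
    leaving the source file itself untouched."""
    with open(src) as f:
        lines = f.readlines()
    with open(header_out, "w") as h:
        h.write(lines[0])
    with open(trailer_out, "w") as t:
        t.write(lines[-1])
```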
by MarkB
Wed Apr 20, 2011 10:24 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reading Flat file (CSV)
Replies: 13
Views: 8595

The second option wouldn't work if there were two commas in a field. If it were free text and commas were allowed, why wouldn't there be two? ... which is why, in this case, the only logical solution is Craig's: the file is bad ... go back to the source and have them give you the file in the correct for...
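A quick illustration of the ambiguity, with made-up data: once a free-text field can contain the delimiter unquoted, a record with embedded commas yields more fields than the layout defines, and no parser can tell which commas are data.

```python
# Layout expects 3 columns: id, comment, date
good = "1001,no commas here,2011-04-20"
bad  = "1001,free text, with, two commas,2011-04-20"

print(len(good.split(",")))   # 3 - parses cleanly
print(len(bad.split(",")))    # 5 - which commas are delimiters?
```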
by MarkB
Wed Apr 20, 2011 10:08 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Data loss potential ODBC stage issue.
Replies: 1
Views: 1136

Since you are loading the same table twice in the same job, my guess would be that your deletes are stepping on each other. My question is: why the deletes? If you are treating a dimension as an SCD1, you don't care about history and are overwriting existing rows and inserting new ones, so shouldn't ...
by MarkB
Thu Apr 14, 2011 1:51 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: rundate from header record of the file
Replies: 13
Views: 7187

jwiles wrote: If the file is a basic text file, delimited or not, just read the entire record into one column ("Record" in the earlier posts) for all of the rows in the file. You can parse the data records later using Column Import.

Regards,
Like I said, more than one way to do it :D
by MarkB
Thu Apr 14, 2011 12:44 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: rundate from header record of the file
Replies: 13
Views: 7187

You haven't said much about the data in your file. Is it always one header record followed by detail records? If so, then one way to do this is to call your job from a Sequence job. Your job would have a parameter - call it RunDate. Create a Sequence. The sequence would be an Execute Command. The co...
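The Execute Command step's job is just to pull the date out of the header record and hand it back as the RunDate parameter. A Python sketch of the extraction, where the fixed 8-character date position in the header is an assumption about the layout:

```python
def rundate_from_header(path):
    """Return the run date from the first (header) record of the file.
    Assumes the date occupies the first 8 characters of the header
    (hypothetical layout - adjust the slice to the real positions)."""
    with open(path) as f:
        header = f.readline().rstrip("\n")
    return header[:8]
```

In the sequence itself this would typically be a one-line shell command such as head -1 on the file, with the command's output mapped to the job's RunDate parameter.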
by MarkB
Thu Apr 14, 2011 5:53 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Removing Tabs and Carriage Returns
Replies: 2
Views: 1858

Re: Removing Tabs and Carriage Returns

Those are not CRLF ... try Char(13) and Char(10), not 12 and 15.
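The character codes, checked in Python: CR is ASCII 13 and LF is ASCII 10 (12 is form feed and 15 is shift-in, which is why the original attempt stripped nothing).

```python
cr, lf = chr(13), chr(10)
assert (cr, lf) == ("\r", "\n")        # CR = 13, LF = 10

dirty = "line1" + cr + lf + "line2"
clean = dirty.replace(cr, "").replace(lf, "")
print(clean)   # line1line2
```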
by MarkB
Tue Apr 12, 2011 7:29 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: DataStage Hangs when Trying to Import From Lotus Notes
Replies: 6
Views: 5896

Unfortunately no ... a case has been opened with IBM and the engineers are looking at it.
by MarkB
Mon Apr 11, 2011 2:02 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Convert function ??
Replies: 3
Views: 4936

kumar_s wrote: Did you try Convert('(e)-','',Input.Filed) ...
I don't think that is what he wants, as that would remove the 'e'. On the other hand, Convert('()-','',"nar(e)-sh") should result in naresh.
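The difference between the two Convert calls, mirrored with Python's str.translate as a stand-in: Convert deletes each listed character wherever it appears, so listing 'e' in the character set removes every 'e' in the string, not just the one in parentheses.

```python
raw = "nar(e)-sh"
wrong = raw.translate(str.maketrans("", "", "(e)-"))   # 'e' is listed, so every e goes
right = raw.translate(str.maketrans("", "", "()-"))    # only the punctuation goes
print(wrong)   # narsh
print(right)   # naresh
```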
by MarkB
Mon Apr 11, 2011 7:23 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Need to Improve Oracle to Netezza Performance
Replies: 5
Views: 2535

Considering Netezza is a database appliance and his source database is Oracle, they are obviously on different servers :roll: . It's possible your network is slowing you down. Are you running directly from Oracle to Netezza with no other steps (Transformer, etc.)? Have you tried writing the Oracle d...
by MarkB
Fri Apr 08, 2011 6:01 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Need to know about the scratchdisk ?
Replies: 3
Views: 2924

When's the interview :roll: ??

You can read up on this in the Parallel Job Developer guide under the Configuration File section.
by MarkB
Mon Apr 04, 2011 9:54 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: DataStage Hangs when Trying to Import From Lotus Notes
Replies: 6
Views: 5896

DataStage Hangs when Trying to Import From Lotus Notes

From searching the forums, I see that connecting to Lotus Notes is a pain. I am in a Windows environment and have the current NotesSQL (8.5) driver installed. I created a test ODBC system DSN to a local Lotus nsf file. Using the example program provided by IBM (in the ODBCDrivers/Example directory),...
by MarkB
Wed Mar 30, 2011 10:28 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Complex Flat File stage
Replies: 1
Views: 1576

Why not try it? You are allowed to specify ASCII and not EBCDIC, and Variable length is also an option.
by MarkB
Wed Mar 30, 2011 8:31 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: ORA-03113: end-of-file on communication channel
Replies: 12
Views: 14699

That's a generic Oracle error stating that the connection to the server timed out or was lost. It could have been an Oracle server issue or a network issue.
by MarkB
Wed Mar 23, 2011 12:31 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Not able to reject rows from transformer
Replies: 8
Views: 8143

Is the only thing you are doing in the transformer checking for nulls? If all you are doing is taking a sequential file, checking one (or more) fields for nulls, then outputting to your target, it would be easy to simply define those fields in the sequential file source as non-nullable, hang a rejec...
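The reject pattern described above, sketched in Python with hypothetical column names: rows whose key field is missing go to a reject stream instead of the target, which is the same effect as marking the field non-nullable on the source stage and hanging a reject link off it.

```python
rows = [
    {"id": "1", "name": "alpha"},
    {"id": None, "name": "beta"},     # fails the null check -> reject
    {"id": "3", "name": "gamma"},
]

target  = [r for r in rows if r["id"] is not None]   # rows that load
rejects = [r for r in rows if r["id"] is None]       # rows that don't

print(len(target), len(rejects))   # 2 1
```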