Search found 19 matches

by dwscblr
Thu Jul 15, 2004 1:49 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Creating Hashfiles: size limitations
Replies: 13
Views: 7294

Creating Hashfiles: size limitations

I have to load 40 million records into a hashfile. The size of each row in the hashfile is 50 bytes. When I created it as a dynamic hashfile, it failed after a million rows with the error: "Abnormal termination of stage CopyOfint2outLoadAddrProcResHash..LoadHashCodedRecordsOut.IDENT1 detected". So I moved...
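A quick back-of-the-envelope check (my own arithmetic, not from the post) suggests why this job might fail: 40 million rows at 50 bytes each sits right at the classic 2 GB size boundary for 32-bit hashed files, and per-record and group overhead can push the file over it. The 2 GB limit as the cause is an assumption, not something the post confirms.

```python
# Rough size estimate for the hashed file described in the post.
# Attributing the failure to the ~2 GB 32-bit file limit is an assumption.
rows = 40_000_000
row_bytes = 50                      # data only; ignores record/group overhead
total_bytes = rows * row_bytes

print(total_bytes)                  # 2,000,000,000 bytes of raw data
print(total_bytes / 2**30)          # ~1.86 GiB, close to the 2 GiB boundary
print(total_bytes < 2**31)          # raw data alone is just under the limit
```

Raw data alone is just under 2 GiB, so storage overhead in the hashed file would plausibly be what tips it past the limit.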
by dwscblr
Tue Jul 13, 2004 10:42 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Balancing input and output record counts
Replies: 13
Views: 4839

If you want job statistics in a file, call DSendJobReport with the argument 2;directorypath in the "After routine" of the job for which the statistics are required. This creates the job log in text format in the specified directorypath. This will give you the number of records read, writte...
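A sketch of the setup the post describes, as it would appear in the job's properties dialog. The routine name and the 2;directorypath argument format are taken from the post; the directory path shown is a placeholder:

```text
Job Properties → After-job subroutine
  Subroutine:  DSendJobReport
  Input value: 2;/path/to/report/dir
```

With report type 2, the routine writes the job report as a text file into the given directory after the job completes.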
by dwscblr
Tue Jul 13, 2004 10:36 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Sequential File: More columns received than expected.
Replies: 3
Views: 1544

I found a way to achieve this. In the Transformer stage I added the following constraint to the output link: NOT(OutputLink.REJECTED=@FALSE) But this is exactly the opposite of what the DataStage documentation states. Documentation excerpt: Input and output link variables are predefined ...
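Reading the constraint literally (assuming standard DataStage BASIC semantics, where @FALSE is 0 and @TRUE is 1) shows why it captures rejected rows:

```text
NOT(OutputLink.REJECTED = @FALSE)
  ⇔  OutputLink.REJECTED <> @FALSE
  ⇔  OutputLink.REJECTED = @TRUE     -- the link passes only rejected rows
```

So the double negative is equivalent to the straightforward test "REJECTED is true", which is why the behavior seems to contradict the documentation's wording.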
by dwscblr
Mon Jul 12, 2004 7:31 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Sequential File: More columns received than expected.
Replies: 3
Views: 1544

Sequential File: More columns received than expected.

We have a DataStage server job that reads from a sequential file, transforms the data, and outputs it to another sequential file. The input is a quote-delimited file that is supposed to have only 15 columns. But the data we receive from our source is erroneous at times. In a file, some rows are sent with l...