I also get the same error code. However, if I manually reset the failed job using Director, it runs fine after that.
So I believe dsjob returns this error code if you try to run a failed job, even if you specify the "reset and run" option in dsjob...
Search found 7 matches
- Mon Jun 27, 2005 9:40 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: PX Jobs reports failure (code 255)
- Replies: 9
- Views: 8345
- Thu Apr 07, 2005 10:22 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Sequential File Size Limit
- Replies: 1
- Views: 1847
Sequential File Size Limit
I'm trying to import a 1GB sequential file using the Sequential File Stage in EE 7.5. The job aborts at the last record, saying the data is bad. This is the error message: StatementDetailFile,0: Short read encountered on import. This most likely indicates one of the following possibilities: 1) The import...
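A truncated final record, often just a missing trailing newline, is a common cause of "short read" errors when importing a newline-delimited sequential file. As a rough diagnostic outside DataStage, a sketch like the following (the file path and record delimiter are assumptions) checks whether the last record is complete:

```python
# Hedged sketch: verify that a newline-delimited file ends with a
# complete record. A partial final row or a missing trailing newline
# can make an importer report a "short read" on the last record.
import os

def last_record_is_complete(path: str) -> bool:
    """Return True if the file is empty or its last byte is a newline."""
    size = os.path.getsize(path)
    if size == 0:
        return True
    with open(path, "rb") as f:
        f.seek(-1, os.SEEK_END)  # inspect only the final byte
        return f.read(1) == b"\n"
```

If this returns False on the problem file, appending a final newline (or fixing the mainframe extract) is worth trying before suspecting the stage itself.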
- Wed Nov 10, 2004 1:21 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Handling of CHAR(x) Data: Differences between 5.2 and 7.x?
- Replies: 4
- Views: 1891
Re: Handling of CHAR(x) Data: Differences between 5.2 and 7.x?
I have the same problem:
Table1 has Col1 as VARCHAR2(20) in Oracle. A lookup on that column returns a 4096-byte string: exactly 1 character plus 4095 trailing spaces. Even with a TRIM() in the lookup SQL, I still get the maximum allowable VARCHAR2 length.
So, I'll join Ray on this one...
- Tue Aug 24, 2004 2:02 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Doing Maximum of character column in an Aggregator
- Replies: 6
- Views: 8041
Update from Ascential
Just to let you guys know that I spoke to Ascential support and they told me this is a bug that will (probably) be fixed in the next release (7.6). In summary, if you use the aggregator stage to do a MAX on a character column, the value gets converted to dfloat and back to string yielding an incorre...
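The reported bug, MAX on a character column being routed through a dfloat conversion, can be illustrated outside DataStage. This sketch (sample key values are made up) shows why a zero-padded character key survives string comparison but not a float round-trip:

```python
# Illustrative sketch (not DataStage code): why taking MAX of a
# character column via a float round-trip corrupts the value.
values = ["0000000012345", "0000000012346"]

# Correct behavior: lexicographic MAX on equal-length zero-padded
# strings matches the numeric maximum and preserves the key format.
assert max(values) == "0000000012346"

# Buggy path: convert to float and back to string, as the aggregator
# reportedly did. The leading zeros vanish and a ".0" suffix appears,
# so the result is no longer a valid 13-character key.
round_tripped = str(float(max(values, key=float)))
assert round_tripped == "12346.0"
```

The comparison itself may even pick the right row; it is the conversion back to string that yields the incorrect value the post describes.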
- Mon Aug 09, 2004 3:28 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Doing Maximum of character column in an Aggregator
- Replies: 6
- Views: 8041
Re: Doing Maximum of character column in an Aggregator
I am using Parallel Extender. In my aggregator, I have a Column for Calculation. In the Maximum Value Output Column, I have 13 character values. I have defined this column as a character value throughout but it contains numbers only. I would like the aggregator to do maximum on it. So I have values...
- Thu Jul 01, 2004 10:35 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Complex flat file to parent-child transforms
- Replies: 2
- Views: 3191
It's a smart question and this is the right place to ask it. You need to identify the primary key of your parent record. You then de-duplicate your child records based on this key to create your parent records. Here is one design: Read in your child records and write them out to the target child da...
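The design described above, derive PARENT rows by de-duplicating CHILD rows on the parent key, can be sketched generically in Python. The field names (`parent_id`, `parent_name`) are illustrative assumptions, not the poster's actual mainframe layout:

```python
# Minimal sketch of the parent-child split: the parent table is built
# by de-duplicating child records on the parent's primary key, keeping
# the first occurrence of each key.
def split_parent_child(child_rows):
    """Return (parents, children) where parents are unique on parent_id."""
    parents = {}
    for row in child_rows:
        # setdefault keeps the first row seen for each parent key.
        parents.setdefault(row["parent_id"],
                           {"parent_id": row["parent_id"],
                            "parent_name": row["parent_name"]})
    return list(parents.values()), child_rows
```

In a real job the parent target would typically be loaded first so the child rows' foreign keys resolve, matching the ordering the answer goes on to describe.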
- Wed Jun 30, 2004 10:42 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Complex flat file to parent-child transforms
- Replies: 2
- Views: 3191
Complex flat file to parent-child transforms
I'd like opinions/suggestions on the following problem: I want to load two tables in a target database, let's call them PARENT and CHILD. I have a flat file generated from the mainframe containing records at the CHILD level. I'd like to load both PARENT and CHILD from this file, having the PARENT da...