Search found 65 matches

by ccatania
Wed Sep 05, 2007 7:41 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to initialize dynamic values in a parameter of a DS job
Replies: 5
Views: 1237

What Josh stated should be all you need; if that doesn't work, you can try this solution. If you are looking to capture the time that the job aborted, you can write it to a data table. In a server job, source this database using the max timestamp. In the server job transform you can execute your ...
by ccatania
Wed Aug 29, 2007 6:18 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Aborting job if too few rows are processed
Replies: 9
Views: 2775

Unless I'm missing something:

Your stage variable is incremented with each record processed, and the constraint tests that value during that same process. The Transformer stage doesn't have to complete before you can test the stage variable's value.

Charlie
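The counter-plus-constraint idea can be sketched in plain Python (an illustration only, not DataStage code; the threshold and the row source are hypothetical):

```python
def run_with_minimum(rows, min_rows):
    """Count rows as they stream through a transform-like loop and
    abort when too few were processed (stage-variable analogue)."""
    count = 0
    out = []
    for row in rows:
        count += 1          # the "stage variable", bumped per record
        out.append(row)     # stand-in for the real derivations
    if count < min_rows:
        raise RuntimeError(f"only {count} rows processed; aborting job")
    return out

print(len(run_with_minimum(range(5), min_rows=3)))  # 5
```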
by ccatania
Tue Aug 28, 2007 1:53 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: DataTypeEbcdicToAscii in Parallel jobs
Replies: 5
Views: 2748

Use the Complex Flat File stage for your source flat file - see the Parallel Job Developer's (PJD) manual.

charlie
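As an aside, the EBCDIC-to-ASCII conversion that stage performs can be reproduced in Python with the built-in cp037 codec (code page 037 is an assumption; a given mainframe source may use cp500, cp1140, etc.):

```python
# "HELLO" encoded in EBCDIC code page 037
ebcdic_bytes = b"\xc8\xc5\xd3\xd3\xd6"
text = ebcdic_bytes.decode("cp037")  # decode EBCDIC bytes to text
print(text)  # HELLO
```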
by ccatania
Tue Aug 28, 2007 1:49 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Date Conversion
Replies: 5
Views: 1960

Try StringToDate(inputdatefield, "%mm%dd%2000yy")
by ccatania
Tue Aug 28, 2007 1:47 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Date Conversion
Replies: 5
Views: 1960

Try StringToDate(inputdatefield, "%mm%dd%2000yy")
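In DataStage the %2000yy token reads a two-digit year as 2000 + yy. Python's strptime has no exact equivalent (its %y pivots at 69, mapping 00-68 to the 2000s), but for recent years the result matches; a quick sketch with an invented MMDDYY input:

```python
from datetime import datetime

# Parse MMDDYY; %y maps "07" to 2007 (00-68 -> 2000s, 69-99 -> 1900s)
d = datetime.strptime("123107", "%m%d%y").date()
print(d)  # 2007-12-31
```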
by ccatania
Tue Aug 28, 2007 1:41 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: 0 rows transferred
Replies: 5
Views: 2319

A few questions first:
How many rows are in the input files?

Are there any constraints present?

What is your target file environment?
by ccatania
Tue Aug 28, 2007 1:38 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Aborting job if too few rows are processed
Replies: 9
Views: 2775

In your first transform, use a stage variable to count the rows, then have that value tested in a constraint. You can then eliminate the other transform and the Aggregator stage.
by ccatania
Tue Aug 28, 2007 1:35 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Aborting job if too few rows are processed
Replies: 9
Views: 2775

In your first transform, use a stage variable to count the rows, then have that value tested in a constraint. You can then eliminate the other transform and the Aggregator stage.
by ccatania
Fri Jun 22, 2007 2:07 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Derivation Advise
Replies: 9
Views: 3584

Work backwards, meaning test for all 3 fields to be empty first, then test for 2, then 1: If IsNull(col1) And IsNull(col2) And IsNull(col3) Then value Else If IsNull(col1) And IsNull(col2) Then value Else If IsNull(col1) Then value Else '' I think this should give you the results you are looking fo...
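The same nested-IsNull derivation sketched in Python (the returned labels are placeholders; substitute the real derivations for each branch):

```python
def derive(col1, col2, col3):
    """Work backwards: test all three null first, then two, then one."""
    if col1 is None and col2 is None and col3 is None:
        return "all three empty"   # hypothetical result
    if col1 is None and col2 is None:
        return "first two empty"   # hypothetical result
    if col1 is None:
        return "first empty"       # hypothetical result
    return ""                      # all populated

print(derive(None, None, None))  # all three empty
```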
by ccatania
Fri Apr 06, 2007 3:00 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Surrogate Key generator
Replies: 7
Views: 4743

I had a similar requirement, where the surrogate key was incremented in one process and had to be used in another. I wrote the surrogate key value to a work table, then in a server job did a SELECT MAX on that value, passed it to a variable, and executed the next job through the server transform. The job executed f...
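The SELECT MAX step can be sketched with sqlite3 (the work-table and column names here are invented for illustration):

```python
import sqlite3

# Simulate a surrogate-key work table and fetch the high-water mark.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE surkey_work (surkey INTEGER)")
conn.executemany("INSERT INTO surkey_work VALUES (?)",
                 [(101,), (102,), (103,)])
(max_key,) = conn.execute("SELECT MAX(surkey) FROM surkey_work").fetchone()
next_key = max_key + 1   # pass this to the next job as a parameter
print(next_key)  # 104
```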
by ccatania
Fri Apr 06, 2007 2:17 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: schema
Replies: 14
Views: 4598

If you are unable to view the data from the input source, try a different delimiter setting, e.g. Tab. You may have to try different settings to find the one that works. You mentioned that your source is a flat file; check (if you can) what it looks like from where it's created.
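One way to take the guesswork out of the delimiter hunt is Python's csv.Sniffer (a standalone aside, not a DataStage feature; the sample data is invented):

```python
import csv

# Let Sniffer guess the delimiter from a sample of the flat file.
sample = "id\tname\tamount\n1\tacme\t9.99\n2\tglobex\t4.50\n"
dialect = csv.Sniffer().sniff(sample, delimiters=",\t|;")
print(repr(dialect.delimiter))  # '\t'
```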
by ccatania
Fri Dec 01, 2006 3:13 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Teradata TPUMP Invalid Timestamp error
Replies: 5
Views: 2312

I thought that the Display column in the Teradata transform was for viewing the data through DS, and not for how the field would be loaded to the table.
So if I understand you, the Teradata Display column length would have to be the same as the actual Teradata field column length. Interesting....
by ccatania
Fri Dec 01, 2006 2:55 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Teradata TPUMP Invalid Timestamp error
Replies: 5
Views: 2312

Thanks for the reply, Ray. The third portion is :00 (no milliseconds), and there is a delimiter to allow for the date/time space. The developer said he changed the Display on the TD Columns tab to equal the column length field, and he can load to TD without any invalid timestamp error. :?: I haven't had...
by ccatania
Fri Dec 01, 2006 8:03 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Teradata TPUMP Invalid Timestamp error
Replies: 5
Views: 2312

Teradata TPUMP Invalid Timestamp error

After concatenating two string fields, a date field : time field : ('00'), into a stage variable, then doing a StringToTimestamp conversion, we received the invalid timestamp error when loading into TD using TPump. The TD timestamp field is defined as TimeStamp(0), therefore milliseconds are not used, and Nulls Yes. In...
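A Python sketch of that concatenation step (the sample values are invented): build the string with an explicit :00 seconds part so it matches a TIMESTAMP(0) column, then validate it with strptime:

```python
from datetime import datetime

date_part, time_part = "2006-12-01", "14:30"            # sample inputs
ts_string = f"{date_part} {time_part}:00"               # append :00 seconds
ts = datetime.strptime(ts_string, "%Y-%m-%d %H:%M:%S")  # raises if invalid
print(ts_string)  # 2006-12-01 14:30:00
```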
by ccatania
Fri Oct 06, 2006 10:54 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Teradata - Field type must be either an int8, int16, ui
Replies: 1
Views: 1720

Teradata - Field type must be either an int8, int16, ui

I had this problem in the past and I thought I remembered the solution, but nothing I have tried has resolved the problem. The source is Teradata using FastLoad, and the fields in the query match the columns in order and SQL type. This is the error: SRCE_VNDR_PO,0: Field type must be either an int8, i...