
DataStage Metadata issue

Posted: Fri Nov 13, 2009 5:19 pm
by yaminids
Friends,

I'm having trouble inserting data into a table with CHAR columns using DataStage.

Due to new project requirements, most of the source columns have been changed to CHAR(XX CHAR) [previously they were CHAR(XX BYTE)].

The source table has 5 columns:
1) Merch_key --> NUMBER(10)
2) Merch_Name --> CHAR(30 CHAR)
3) Merch_Addr_Line1 --> CHAR(30 CHAR)
4) Merch_Addr_Line2 --> CHAR(30 CHAR)
5) Merch_City --> CHAR(30 CHAR)

The destination table has the same columns:
1) Merch_key --> NUMBER(10)
2) Merch_Name --> CHAR(30 CHAR)
3) Merch_Addr_Line1 --> CHAR(30 CHAR)
4) Merch_Addr_Line2 --> CHAR(30 CHAR)
5) Merch_City --> CHAR(30 CHAR)
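
For reference, the destination DDL looks roughly like this (a sketch only; MERCHANT is a placeholder table name, and constraints and storage clauses are omitted):

    -- Sketch of the destination table; MERCHANT is a placeholder name.
    -- CHAR(30 CHAR) uses character-length semantics: each column holds 30
    -- characters, which can occupy up to 90 bytes (UTF8) or 120 bytes
    -- (AL32UTF8) on disk.
    CREATE TABLE merchant (
        merch_key        NUMBER(10),
        merch_name       CHAR(30 CHAR),
        merch_addr_line1 CHAR(30 CHAR),
        merch_addr_line2 CHAR(30 CHAR),
        merch_city       CHAR(30 CHAR)
    );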

The DataStage job is supposed to populate the destination table without any transformations.

The problem is with the DataStage job. For some reason, DataStage is padding the columns with trailing spaces out to three times the declared length, in effect treating each column as CHAR(90 CHAR). When I execute the job, all the rows are rejected with an 'Inserted value too large, row rejected' error.

The job executes successfully when I 'Trim' the columns in the Transformer. I think this is a poor workaround, as it has an adverse effect on performance. Also, there are quite a few existing jobs, and trimming all the CHAR columns in each of them would be a cumbersome task.
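
To illustrate, the workaround derivation in the Transformer looks like this for each CHAR column (InLink is a placeholder for the actual input link name):

    Merch_Name       derivation:  Trim(InLink.Merch_Name)
    Merch_Addr_Line1 derivation:  Trim(InLink.Merch_Addr_Line1)

and likewise for the other CHAR columns.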

I was able to insert the records manually into the destination table using SQL*Plus, which makes me think the problem lies with DataStage.
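
For example, an insert like this (with made-up sample values, against the placeholder MERCHANT table above) goes through cleanly from SQL*Plus:

    INSERT INTO merchant (merch_key, merch_name, merch_addr_line1,
                          merch_addr_line2, merch_city)
    VALUES (1, 'Test Merchant', '123 Main St', NULL, 'Springfield');
    COMMIT;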

Is there a setting at the server level which prevents adding spaces to the CHAR columns?

Any help would be highly appreciated.

Thanks
Yamini

Posted: Fri Nov 13, 2009 11:24 pm
by ray.wurlod
Research "derivation substitution" to reduce the cumbersome aspect.

I am not aware of any setting that changes the string pad character in server jobs - you could try setting APT_STRING_PADCHAR to 0x00, but I doubt it would have any effect in server jobs.

Posted: Sat Nov 14, 2009 9:12 am
by chulett
A little confused... does the metadata in the job still declare these fields as Char(30), or did you reimport from Oracle so that they now show as Char(90)?

The metadata import has a tendency to do that. If that's the case, you just need to reset the size back to 30.
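
If you want to confirm what Oracle is reporting (and therefore what a reimport would pick up), a data dictionary query along these lines would show it (using MERCHANT as the placeholder table name from your example):

    -- CHAR_USED = 'C' means character-length semantics; DATA_LENGTH is the
    -- byte allocation, typically 3x or 4x CHAR_LENGTH in a UTF8/AL32UTF8
    -- database, which would explain an imported length of 90.
    SELECT column_name, data_type, char_length, char_used, data_length
      FROM all_tab_columns
     WHERE table_name = 'MERCHANT';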

Posted: Sun Nov 15, 2009 8:02 pm
by yaminids
Craig,

I have tried that option as well. I modified the size of the columns back to Char(30), but the job still fails with the 'Inserted value too large, row rejected' error.

The job completes successfully only if I a) Trim the columns or b) increase the size of the columns in the Transformer.

Thanks
Yamini

Posted: Mon Nov 16, 2009 1:41 pm
by yaminids
Ray,

Where can I set the parameter 'APT_STRING_PADCHAR'?

Thanks
Yamini

Posted: Mon Nov 16, 2009 2:09 pm
by chulett
It's an environment variable. As noted, though, I doubt it would change your Server job's behaviour at all, as it is a PX-specific variable.
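
If you still want to experiment with it, it's normally added through the Administrator client (project Properties > General > Environment..., under the Parallel branch, if memory serves) and can then be exposed as a job parameter, e.g.:

    APT_STRING_PADCHAR = 0x00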

Posted: Tue Nov 17, 2009 11:10 am
by yaminids
Is 'Trim' the only solution available?