DataStage Metadata issue

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

yaminids
Premium Member
Posts: 387
Joined: Mon Oct 18, 2004 1:04 pm

DataStage Metadata issue

Post by yaminids »

Friends,

I'm having trouble inserting data into a table with CHAR columns using DataStage.

Due to new project requirements, most of the source columns have been changed to CHAR(XX CHAR); previously they were CHAR(XX BYTE). (See the DDL sketch after the column lists below.)

The source table has 5 columns:
1) Merch_key --> NUMBER(10)
2) Merch_Name --> CHAR(30 CHAR)
3) Merch_Addr_Line1 --> CHAR(30 CHAR)
4) Merch_Addr_Line2 --> CHAR(30 CHAR)
5) Merch_City --> CHAR(30 CHAR)

The destination table has the same columns:
1) Merch_key --> NUMBER(10)
2) Merch_Name --> CHAR(30 CHAR)
3) Merch_Addr_Line1 --> CHAR(30 CHAR)
4) Merch_Addr_Line2 --> CHAR(30 CHAR)
5) Merch_City --> CHAR(30 CHAR)
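
For reference, the difference between the two declarations in Oracle DDL looks roughly like this (a minimal sketch; the table names are illustrative, and a multi-byte database character set such as UTF8, where a character can occupy up to 3 bytes, is assumed -- which would match the factor of 3 the job reports):

    -- Old definition: byte semantics, exactly 30 bytes of storage
    CREATE TABLE MERCH_SRC_BYTE (
        Merch_key         NUMBER(10),
        Merch_Name        CHAR(30 BYTE),
        Merch_Addr_Line1  CHAR(30 BYTE),
        Merch_Addr_Line2  CHAR(30 BYTE),
        Merch_City        CHAR(30 BYTE)
    );

    -- New definition: character semantics, 30 characters, which in a
    -- UTF8 database can occupy up to 90 bytes (3 bytes per character)
    CREATE TABLE MERCH_SRC_CHAR (
        Merch_key         NUMBER(10),
        Merch_Name        CHAR(30 CHAR),
        Merch_Addr_Line1  CHAR(30 CHAR),
        Merch_Addr_Line2  CHAR(30 CHAR),
        Merch_City        CHAR(30 CHAR)
    );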

The DataStage job is supposed to populate the destination table without any transformations.

The problem is with the DataStage job. For some reason, DataStage is padding the values with spaces out to three times the column length, in effect treating each column as CHAR(90 CHAR). When I execute the job, all of the rows are rejected with an 'Inserted value too large, row rejected' error.

The job executes successfully when I Trim the columns in the Transformer. I'd rather avoid this, as it has an adverse effect on performance. Also, there are quite a few existing jobs, and trimming every CHAR column in all of them would be a cumbersome task.
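
For the record, the Trim workaround in the Transformer derivation looks something like this (a sketch; DSLink3 is an illustrative input link name):

    * Derivation for the output column, stripping the pad spaces:
    Trim(DSLink3.Merch_Name)

    * Or, trimming trailing spaces only, which preserves any
    * leading spaces in the data:
    Trim(DSLink3.Merch_Name, " ", "T")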

I was able to insert the records manually into the destination table using SQL*Plus, which makes me think the problem may lie with DataStage.
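
For reference, the manual insert was along these lines, and the session's length semantics can be checked from the same SQL*Plus session (a sketch; MERCH_DEST and the values are illustrative):

    -- Confirm how this session resolves CHAR lengths
    SELECT value
      FROM nls_session_parameters
     WHERE parameter = 'NLS_LENGTH_SEMANTICS';

    -- A manual insert of this form succeeds
    INSERT INTO MERCH_DEST
        (Merch_key, Merch_Name, Merch_Addr_Line1, Merch_Addr_Line2, Merch_City)
    VALUES
        (1, 'Test Merchant', '123 Main St', 'Suite 100', 'Sydney');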

Is there a setting at the server level that prevents adding spaces to CHAR columns?

Any help would be greatly appreciated.

Thanks
Yamini
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Research "derivation substitution" to reduce the cumbersome aspect.

I am not aware of any setting that changes the string pad character in server jobs. You could try setting APT_STRING_PADCHAR to 0x00, but I doubt it would have any effect there.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

A little confused... does the metadata in the job still declare these fields as Char(30), or did you reimport from Oracle and they now show as Char(90)? :?

The import has a tendency to do that. If that's the case, you just need to reset the size.
-craig

"You can never have too many knives" -- Logan Nine Fingers
yaminids
Premium Member
Posts: 387
Joined: Mon Oct 18, 2004 1:04 pm

Post by yaminids »

Craig,

I have tried that option as well. I modified the size of the columns back to Char(30), but the job still fails with the 'Inserted value too large, row rejected' error.

The job completes successfully only if I (a) trim the columns or (b) increase the size of the columns in the Transformer.

Thanks
Yamini
yaminids
Premium Member
Posts: 387
Joined: Mon Oct 18, 2004 1:04 pm

Post by yaminids »

Ray,

Where can I set the parameter 'APT_STRING_PADCHAR'?

Thanks
Yamini
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

It's an environment variable. As noted, though, I doubt it would change your Server job's behaviour at all, as it is a PX-specific variable.
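
For completeness, a server-wide variable would go in $DSHOME/dsenv, and project-level variables can also be defined through the Administrator client's environment variable settings. A minimal dsenv sketch, assuming a Unix install:

    # $DSHOME/dsenv -- sourced by the DataStage server environment
    APT_STRING_PADCHAR=0x00
    export APT_STRING_PADCHAR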
-craig

"You can never have too many knives" -- Logan Nine Fingers
yaminids
Premium Member
Posts: 387
Joined: Mon Oct 18, 2004 1:04 pm

Post by yaminids »

Is 'Trim' the only solution available?