Search found 13 matches

by FDW_CITI
Thu Jun 24, 2010 11:53 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Dynamic Metadata
Replies: 8
Views: 3471

Do users 1, 2 and 3 read different target tables? Or do you just have one target table but 3 different query requirements? ... The target table is a single table with 3 different query requirements. Users 1, 2 & 3 may select any columns from Col6 to Col10 for loading the target table. The col...
by FDW_CITI
Thu Jun 24, 2010 5:58 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Dynamic Metadata
Replies: 8
Views: 3471

Dynamic Metadata

We have a requirement as below: Table A: Col1 Col2 Col3 Col4 Col5 Col6 Col7 Col8 Col9 Col10. All users require Col1 to Col5 to be loaded to the target table. User1 needs Col6 & Col7 also to be loaded in the target along with Col1 to Col5. This may change later, as User1 may require Col6, Col...
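
Purely to illustrate the column sets described above (the schema-file framing and the string types are my assumptions; only the column names come from the post), User1's current requirement of Col1 to Col5 plus Col6 and Col7 could be written as a runtime schema like:

    record
    (
      Col1: string[max=50];
      Col2: string[max=50];
      Col3: string[max=50];
      Col4: string[max=50];
      Col5: string[max=50];
      Col6: string[max=50];
      Col7: string[max=50];
    )

If the optional columns keep changing per user, generating such a schema at run time and relying on Runtime Column Propagation so the job does not hard-code the column list is one possible direction; that suggestion is mine, not something stated in the thread.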
by FDW_CITI
Sun Apr 18, 2010 11:59 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: ds_ipcput() - row too big for inter stage rowbuffer
Replies: 4
Views: 3354

Hi, I was able to solve this issue by removing the Transformer stage in my shared container. I had used the Transformer stage to filter out a single file name fetched using the Folder stage. Now I am able to read a single file using the Folder stage itself by passing the file name in the wildcard property in...
by FDW_CITI
Fri Apr 16, 2010 5:58 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: ds_ipcput() - row too big for inter stage rowbuffer
Replies: 4
Views: 3354

Hi, I have already gone through the post. When I read the XML file (1600 KB) in a server job, it works fine. The same container, referenced in a parallel job (v8.1) as a shared container, gives the error "ds_ipcput() - row too big for inter stage rowbuffer". I tried turning off the row buffer...
by FDW_CITI
Fri Apr 16, 2010 4:38 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: ds_ipcput() - row too big for inter stage rowbuffer
Replies: 4
Views: 3354

ds_ipcput() - row too big for inter stage rowbuffer

Hi, I have created a server shared container for reading an XML file. It contains the following stages: Folder stage --> Transformer --> XML Input stage. I am using this shared container in my parallel job, where I am getting the error "ds_ipcput() - row too big for inter stage rowbuffer". C...
by FDW_CITI
Tue Oct 21, 2008 12:56 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reg: NULL values in comma delimited file
Replies: 15
Views: 8411

Thanks Nagaraj. The problem was due to a mismatch in the datatypes. We receive a fixed-width file, and the file is read using a schema file definition. The numeric columns are read as int32 via the schema file, and within DataStage the data flows as integer values. But in the final Oracle table, the columns are...
by FDW_CITI
Mon Oct 20, 2008 9:12 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reg: NULL values in comma delimited file
Replies: 15
Views: 8411

Sorry... :(

I didn't mean to push; I just wanted to know whether the issue was solved or not.
by FDW_CITI
Fri Oct 03, 2008 2:22 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reg: NULL values in comma delimited file
Replies: 15
Views: 8411

Any answer for this issue?
by FDW_CITI
Fri Oct 03, 2008 1:07 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Oracle_DB: "null_field" length (5) must match fiel
Replies: 1
Views: 1481

Oracle_DB: "null_field" length (5) must match fiel

We are using the schema file as below: record {record_delim =' ', final_delim=none, delim=none} ( CTL1:nullable int32 {width=2, null_field=' '}; CTL2:nullable int32 {width=4, null_field=' '}; CTL3:nullable int32 {width=4, null_field=' '}; CTL4:nullable int32 {width=4, null_field=' '}; } While readin...
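
For context, this error usually means the null_field string has a different length from the declared field width (here apparently 5 characters against widths of 2 and 4), and the closing brace at the end of the quoted schema would normally be a closing parenthesis. A hedged sketch of a self-consistent version of the same schema, with the null_field padding and record delimiter assumed rather than taken from the original post, might look like:

    record {record_delim='\n', final_delim=none, delim=none}
    (
      CTL1: nullable int32 {width=2, null_field='  '};
      CTL2: nullable int32 {width=4, null_field='    '};
      CTL3: nullable int32 {width=4, null_field='    '};
      CTL4: nullable int32 {width=4, null_field='    '};
    )

Here each null_field value is exactly as wide as its field (2 or 4 blanks), which is what the import of a fixed-width record requires.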
by FDW_CITI
Thu Oct 02, 2008 11:43 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Datastage Error
Replies: 6
Views: 4025

Yes, it is set ...

Please respond to my new post with subject "UNICODE for CHAR/ VARCHAR2 columns"
by FDW_CITI
Thu Oct 02, 2008 11:40 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: UNICODE for CHAR/ VARCHAR2 columns
Replies: 2
Views: 1810

UNICODE for CHAR/ VARCHAR2 columns

Can anyone explain what UNICODE signifies in column definitions?

We have defined the NLS Lang as UTF8, and all column datatypes are VARCHAR2. We are using an Oracle database for loading the data.

Do we need to define the extended type as UNICODE?

Regards,
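
For what it's worth, a minimal sketch of how the two settings differ in the parallel schema notation (the column names are invented): a Char/VarChar column with Extended set to Unicode maps to ustring, while one without the extended type maps to string:

    record
    (
      CUST_NAME: nullable ustring[max=100];
      CUST_CODE: nullable string[max=20];
    )

Whether the extended type is actually needed depends on whether the VARCHAR2 data can contain multi-byte characters; with a UTF8 NLS setting and purely single-byte data, plain string may be enough, but that is an assumption, not something confirmed in this thread.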
by FDW_CITI
Thu Oct 02, 2008 9:03 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Datastage Error
Replies: 6
Views: 4025

UNICODE in Column Definition

Can anyone explain what UNICODE signifies in column definitions?

We have defined the NLS Lang as UTF8, and all column datatypes are VARCHAR2. We are using an Oracle database for loading the data.

Do we need to define the extended type as UNICODE?

Regards,
Manju