Search found 13 matches
- Thu Jun 24, 2010 11:53 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Dynamic Metadata
- Replies: 8
- Views: 3471
- Thu Jun 24, 2010 5:58 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Dynamic Metadata
- Replies: 8
- Views: 3471
Dynamic Metadata
We have a requirement as below: Table A: Col1 Col2 Col3 Col4 Col5 Col6 Col7 Col8 Col9 Col10. All users require Col1 to Col5 to be loaded to the target table. User1 needs Col6 & Col7 also to be loaded in the target along with Col1 to Col5. This may change later as user1 may require Col6, Col...
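One common way to handle a changing per-user column list like this (not confirmed by the thread itself) is a runtime schema file combined with Runtime Column Propagation (RCP), so the job does not need recompiling when a user's column set changes. A minimal sketch, assuming a Sequential File source whose Schema File property points at a per-user schema file; the file name and the string types are hypothetical:

```
record {final_delim=end, delim=','} (
    Col1: string[max=50];
    Col2: string[max=50];
    Col3: string[max=50];
    Col4: string[max=50];
    Col5: string[max=50];
    Col6: string[max=50];
    Col7: string[max=50];
)
```

With RCP enabled on the job and its stages, the columns listed in the schema file propagate through to the target without being declared on every link, so adding or dropping Col6/Col7 for user1 would only mean editing that user's schema file.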
- Sun Apr 18, 2010 11:59 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: ds_ipcput() - row too big for inter stage rowbuffer
- Replies: 4
- Views: 3354
Hi, I was able to solve this issue by removing the Transformer stage in my Shared Container. I had used the Transformer stage to filter out a single file name fetched using the Folder stage. Now I am able to read a single file using the Folder stage itself by passing the file name in the wildcard property in...
- Fri Apr 16, 2010 5:58 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: ds_ipcput() - row too big for inter stage rowbuffer
- Replies: 4
- Views: 3354
- Fri Apr 16, 2010 4:38 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: ds_ipcput() - row too big for inter stage rowbuffer
- Replies: 4
- Views: 3354
ds_ipcput() - row too big for inter stage rowbuffer
Hi, I have created a server shared container for reading an XML file. It contains the following stages: Folder stage --> Transformer --> XML Input stage. I am using this shared container in my parallel job, where I am getting the error "ds_ipcput() - row too big for inter stage rowbuffer". C...
- Tue Oct 21, 2008 12:56 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Reg: NULL values in comma delimited file
- Replies: 15
- Views: 8411
Thanks Nagaraj. The problem was due to a mismatch in the datatypes. We are getting a fixed-width file, and the file is read using a schema file definition. The numeric columns are read as int32 using the schema file, and in DataStage the data flows as Integer values. But in the final Oracle table, the columns are...
- Mon Oct 20, 2008 9:12 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Reg: NULL values in comma delimited file
- Replies: 15
- Views: 8411
- Fri Oct 03, 2008 2:22 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Reg: NULL values in comma delimited file
- Replies: 15
- Views: 8411
- Fri Oct 03, 2008 1:07 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Oracle_DB: "null_field" length (5) must match fiel
- Replies: 1
- Views: 1481
Oracle_DB: "null_field" length (5) must match fiel
We are using the schema file as below (note the record body should close with `)`, not `}`):
record {record_delim=' ', final_delim=none, delim=none} (
CTL1: nullable int32 {width=2, null_field=' '};
CTL2: nullable int32 {width=4, null_field=' '};
CTL3: nullable int32 {width=4, null_field=' '};
CTL4: nullable int32 {width=4, null_field=' '};
)
While readin...
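The error in the topic title typically means the string supplied for null_field is not the same length as the declared field width. A hedged guess at a corrected version of the schema above, padding each null_field to exactly its field's width (the space counts are assumptions; the forum software may have collapsed the original spaces):

```
record {record_delim=' ', final_delim=none, delim=none} (
    CTL1: nullable int32 {width=2, null_field='  '};
    CTL2: nullable int32 {width=4, null_field='    '};
    CTL3: nullable int32 {width=4, null_field='    '};
    CTL4: nullable int32 {width=4, null_field='    '};
)
```

Here null_field for CTL1 is two spaces to match width=2, and the remaining fields use four spaces to match width=4.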
- Fri Oct 03, 2008 1:01 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: UNICODE for CHAR/ VARCHAR2 columns
- Replies: 2
- Views: 1810
- Thu Oct 02, 2008 11:43 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Datastage Error
- Replies: 6
- Views: 4025
- Thu Oct 02, 2008 11:40 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: UNICODE for CHAR/ VARCHAR2 columns
- Replies: 2
- Views: 1810
UNICODE for CHAR/ VARCHAR2 columns
Can anyone explain what UNICODE signifies in column definitions?
We have defined the NLS Lang as UTF8 and all column datatypes are VARCHAR2. We are using an Oracle database for loading the data.
Do we need to define the extended type as UNICODE?
Regards,
- Thu Oct 02, 2008 9:03 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Datastage Error
- Replies: 6
- Views: 4025
UNICODE in Column Definition
Can anyone explain what UNICODE signifies in column definitions?
We have defined the NLS Lang as UTF8 and all column datatypes are VARCHAR2. We are using an Oracle database for loading the data.
Do we need to define the extended type as UNICODE?
Regards,
Manju