Hi:
Is it possible to create one output CSV file per row, each containing a single record?
Thanks,
Praveen
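Outside DataStage, the mechanics of "one file per record" are straightforward; a minimal Python sketch for reference (the function name, the `row_NNNNNN.csv` naming convention, and the optional header are my own assumptions, not anything DataStage-specific):

```python
import csv
import os

def split_rows_to_files(input_path, output_dir, header=None):
    """Write each row of the input CSV to its own single-record output file."""
    os.makedirs(output_dir, exist_ok=True)
    with open(input_path, newline="") as src:
        for i, row in enumerate(csv.reader(src), start=1):
            # One file per row, numbered in input order.
            out_path = os.path.join(output_dir, f"row_{i:06d}.csv")
            with open(out_path, "w", newline="") as dst:
                writer = csv.writer(dst)
                if header:
                    writer.writerow(header)
                writer.writerow(row)
```

In DataStage itself this is usually done by deriving the target file name per row and writing through a stage that supports dynamic file names, but the excerpt above does not say which stage the poster ended up using.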
Search found 102 matches
- Tue Oct 04, 2011 9:21 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: For each row one output file
- Replies: 2
- Views: 1674
- Wed May 25, 2011 11:54 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Entire text file as a single Message to MQ
- Replies: 3
- Views: 2148
- Wed May 25, 2011 8:42 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Entire text file as a single Message to MQ
- Replies: 3
- Views: 2148
Entire text file as a single Message to MQ
Hi,
Has anyone sent an entire text file as a single message to MQ before? If so, please let me know which option we need to set in the MQ Connector stage.
Thanks in advance,
Praveen.
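The excerpt does not record which MQ Connector stage option solved this, but the underlying idea is to treat the whole file as one payload rather than one message per line. A hedged sketch of the equivalent logic outside DataStage (pymqi and every connection parameter here are assumptions for illustration, not part of the original thread):

```python
def file_as_single_payload(path):
    """Read the entire file as one byte string -- the whole-file message body."""
    with open(path, "rb") as f:
        return f.read()

def send_file_as_one_message(path, qmgr_name, channel, conn_info, queue_name):
    """Put the whole file on an MQ queue as a single message.

    Requires the pymqi IBM MQ bindings; all parameters are hypothetical.
    """
    import pymqi  # assumed available; not part of DataStage itself
    payload = file_as_single_payload(path)
    qmgr = pymqi.connect(qmgr_name, channel, conn_info)
    try:
        queue = pymqi.Queue(qmgr, queue_name)
        queue.put(payload)  # one put call == one MQ message
        queue.close()
    finally:
        qmgr.disconnect()
```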
- Wed May 18, 2011 10:10 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Datastage jobs not running and validating
- Replies: 5
- Views: 3608
- Wed May 18, 2011 5:03 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Datastage jobs not running and validating
- Replies: 5
- Views: 3608
- Tue May 17, 2011 2:14 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Datastage jobs not running and validating
- Replies: 5
- Views: 3608
Datastage jobs not running and validating
Hi,
I have installed DataStage 8.0. After successful installation I tried to run a sample job (RowGen -> Peek). It compiled successfully, but it neither runs nor validates.
If you have any ideas to resolve this issue, please let me know.
Thanks in advance,
Praveen.
- Wed Jun 24, 2009 9:59 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: the record is too big to fit into block
- Replies: 22
- Views: 24146
- Wed Jun 24, 2009 9:55 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: the record is too big to fit into block
- Replies: 22
- Views: 24146
- Wed Jun 24, 2009 9:27 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: the record is too big to fit into block
- Replies: 22
- Views: 24146
- Wed Jun 24, 2009 9:25 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: the record is too big to fit into block
- Replies: 22
- Views: 24146
- Wed Jun 24, 2009 5:26 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: the record is too big to fit into block
- Replies: 22
- Views: 24146
Hi, I have partially resolved the issue, but I still have a problem with a single record. My total number of records is 224400, and only one record causes the problem, because it is five times bigger than all the others. I am able to write that record into a sequential file, but I am not able to write it into a Dat...
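When one record out of hundreds of thousands is the culprit, it helps to locate it by size before tuning block-size variables. A small Python sketch that scans a flat file for oversized records (the function name and the byte-length-per-line measure are my own; line length is only an approximation of DataStage's internal record size):

```python
def find_oversized_records(path, limit):
    """Return (1-based line number, length) for records longer than `limit` bytes."""
    oversized = []
    with open(path, "rb") as f:
        for lineno, line in enumerate(f, start=1):
            length = len(line.rstrip(b"\r\n"))  # exclude the line terminator
            if length > limit:
                oversized.append((lineno, length))
    return oversized
```

Running this with `limit` set near the transport block size would have pointed straight at the one record (around count 22389 in the later excerpt) that was five times larger than the rest.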
- Tue Jun 23, 2009 7:42 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: the record is too big to fit into block
- Replies: 22
- Views: 24146
There are other APT variables that may help; check this post to see if anything there works for you. ... APT_AUTO_TRANSPORT_BLOCK_SIZE = False APT_DEFAULT_TRANSPORT_BLOCK_SIZE = 400000 APT_MAX_TRANSPORT_BLOCK_SIZE = 1048576 APT_LATENCY_COEFFICIENT = 5 I have set all the variables at the project level B...
- Tue Jun 23, 2009 7:33 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: the record is too big to fit into block
- Replies: 22
- Views: 24146
- Tue Jun 23, 2009 7:22 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: the record is too big to fit into block
- Replies: 22
- Views: 24146
If you are getting the same error message, then obviously the change isn't being accepted. Can you check the value in the Director log for your run? ... I checked the logs in DataStage; it is taking whatever values we pass. But when it reaches record count 22389, the job fails ...
- Tue Jun 23, 2009 7:07 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: the record is too big to fit into block
- Replies: 22
- Views: 24146