Search found 102 matches

by thumati.praveen
Tue Oct 04, 2011 9:21 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: For each row one output file
Replies: 2
Views: 1674

For each row one output file

Hi:

Is it possible to create one output CSV file per input row, each containing a single record?

Thanks,
Praveen
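
No answer is shown in this excerpt, but the goal itself is easy to pin down. A minimal sketch of the target layout in Python, with made-up file names (inside DataStage the split would have to come from the job design instead):

```python
import csv
import os

def split_rows(input_path, out_dir):
    """Write each data row of input_path to its own one-record CSV."""
    os.makedirs(out_dir, exist_ok=True)
    with open(input_path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)                 # assume a header line
        for i, row in enumerate(reader, start=1):
            out_path = os.path.join(out_dir, f"row_{i:06d}.csv")
            with open(out_path, "w", newline="") as dst:
                writer = csv.writer(dst)
                writer.writerow(header)       # repeat the header per file
                writer.writerow(row)          # exactly one data record

split_rows("input.csv", "split_output")       # names are made up
```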
by thumati.praveen
Wed May 25, 2011 11:54 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Entire text file as a single Message to MQ
Replies: 3
Views: 2148

Thanks for the suggestion. Is there any way to do this in a parallel job?

Thanks Again,
Praveen.
by thumati.praveen
Wed May 25, 2011 8:42 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Entire text file as a single Message to MQ
Replies: 3
Views: 2148

Entire text file as a single Message to MQ

Hi,

Has anyone sent an entire text file as a single message to MQ before? If so, please let me know which options have to be set in the MQ Connector stage.

Thanks in advance,
Praveen.
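
For reference, the same "whole file as one message" put can be sketched outside DataStage with the pymqi client. The queue manager, channel, and queue names below are placeholders for illustration, not settings from the thread:

```python
import pymqi

# all connection details below are placeholders, not from the thread
queue_manager = "QM1"
channel = "DEV.APP.SVRCONN"
conn_info = "mqhost(1414)"
queue_name = "DEV.QUEUE.1"

with open("payload.xml", "rb") as f:
    message = f.read()            # the whole file becomes one payload

qmgr = pymqi.connect(queue_manager, channel, conn_info)
queue = pymqi.Queue(qmgr, queue_name)
queue.put(message)                # a single MQPUT for the entire file
queue.close()
qmgr.disconnect()
```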
by thumati.praveen
Wed May 18, 2011 10:10 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Datastage jobs not running and validating
Replies: 5
Views: 3608

Hi:

We have installed the same software in other environments, and it is working fine there.

Thanks
Praveen.
by thumati.praveen
Wed May 18, 2011 5:03 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Datastage jobs not running and validating
Replies: 5
Views: 3608

Hi,

There is no log in DataStage Director to analyse the problem; the job is in compiled status only. Can you tell me why a job would not run even though it compiled successfully?

Thanks in advance,
Praveen.
by thumati.praveen
Tue May 17, 2011 2:14 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Datastage jobs not running and validating
Replies: 5
Views: 3608

Datastage jobs not running and validating

Hi,

I have installed DataStage 8.0. After a successful installation I tried to run a sample job (rowGen -> Peek). It compiled successfully, but it will neither run nor validate.

If you have any ideas on how to resolve this issue, please let me know.

Thanks in advance,
Praveen.
by thumati.praveen
Wed Jun 24, 2009 9:59 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: the record is too big to fit into block
Replies: 22
Views: 24146

Hi,

Can I resolve this issue by converting the data into binary format using the Column Export stage?

Thanks,
Praveen.
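
The excerpt does not say whether this worked. Conceptually, the Column Export stage packs several input columns into a single string or binary output column; a rough Python analogue with struct and a hypothetical field layout is sketched below. Note that packing into binary does not by itself shrink a record, so on its own it is unlikely to bring an oversized record under the transport block limit:

```python
import struct

def export_columns(cust_id, balance, name):
    """Pack three columns into one binary field (layout is hypothetical)."""
    name_bytes = name.encode("utf-8")
    # network byte order: 4-byte int, 8-byte double,
    # 2-byte name length, then the name itself
    return struct.pack(f"!idH{len(name_bytes)}s",
                       cust_id, balance, len(name_bytes), name_bytes)

packed = export_columns(42, 1234.56, "Praveen")
print(len(packed), "bytes")  # 4 + 8 + 2 + 7 = 21 bytes
```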
by thumati.praveen
Wed Jun 24, 2009 9:27 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: the record is too big to fit into block
Replies: 22
Views: 24146

How did you "resolve the issue somewhat"? Have you made changes which resolved it for the most part (as more than one record was causing a problem) or are you saying that there was only ever an issue with this single record? It is not aborting due to a single record, but for some couple of...
by thumati.praveen
Wed Jun 24, 2009 9:25 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: the record is too big to fit into block
Replies: 22
Views: 24146

miwinter wrote:How did you "resolve the issue somewhat"? Have you made changes which resolved it for the most part (as more than one record was causing a problem) or are you saying that there was only ever an issue with this single record?
Finding the cause of the issue is what I meant by "resolved somewhat".
by thumati.praveen
Wed Jun 24, 2009 5:26 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: the record is too big to fit into block
Replies: 22
Views: 24146

Hi, I have resolved the issue somewhat, but I still have an issue with a single record. My total number of records is 224400. Only one record is giving a problem, because it is five times bigger than all the others. I am able to write that record into a sequential file, but I am not able to write it into a Dat...
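
Before raising the block-size variables further (see the APT_* settings quoted in the next entry), it can help to measure how large the worst record actually is. A small diagnostic sketch, assuming newline-delimited input and a made-up file name:

```python
def max_record_length(path):
    """Return (line number, byte length) of the longest record."""
    worst_line, worst_len = 0, 0
    with open(path, "rb") as f:
        for line_no, line in enumerate(f, start=1):
            if len(line) > worst_len:
                worst_line, worst_len = line_no, len(line)
    return worst_line, worst_len

line_no, length = max_record_length("input.dat")  # file name is made up
print(f"longest record: line {line_no}, {length} bytes")
# APT_DEFAULT_TRANSPORT_BLOCK_SIZE must comfortably exceed this length;
# the thread raised it from the 131072 default to 400000.
```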
by thumati.praveen
Tue Jun 23, 2009 7:42 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: the record is too big to fit into block
Replies: 22
Views: 24146

There are other APT variables that may help, check this post to see if anything there works for you. ...
APT_AUTO_TRANSPORT_BLOCK_SIZE = False
APT_DEFAULT_TRANSPORT_BLOCK_SIZE = 400000
APT_MAX_TRANSPORT_BLOCK_SIZE = 1048576
APT_LATENCY_COEFFICIENT = 5
I have set all the variables at project level, b...
by thumati.praveen
Tue Jun 23, 2009 7:33 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: the record is too big to fit into block
Replies: 22
Views: 24146

Just how same is the "same error" you get after changing the APT variable? Exactly the same as in it still says "the max block length is: 131072" or just that it is still "too big" but you see ... It always gives the error below: APT_CombinedOperatorController(4),1: Fatal ...
by thumati.praveen
Tue Jun 23, 2009 7:22 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: the record is too big to fit into block
Replies: 22
Views: 24146

If you are getting the same error message, then obviously the change isn't being accepted. Can you check the value in the director log for your run? ... I checked the logs in DataStage. It is taking whatever values we are passing, but when it reaches record count 22389 the job fails ...
by thumati.praveen
Tue Jun 23, 2009 7:07 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: the record is too big to fit into block
Replies: 22
Views: 24146

My input record is a combination of XML chunks.

I am taking many XML chunks as a single column in my input row.

Thanks,
Praveen.