the record is too big to fit in a block
Hi,
Has anyone come across the problem below before?
APT_CombinedOperatorController(5),0: Fatal Error: File data set, file "/etl/IS/Datasets/ds_isap_irde020_am_gm_gbo_final.ds".; output of "cpyROw": the record is too big to fit in a block; the length requested is: 145247, the max block length is: 131072.
Parameters for the job are:
$APT_CONFIG_FILE = /opt/IBM/InformationServer/Server/Configurations/default.apt
$APT_RECORD_COUNTS = False
$APT_DUMP_SCORE = False
$APT_MONITOR_SIZE = 500000
$APT_MONITOR_TIME = 20
$APT_DEFAULT_TRANSPORT_BLOCK_SIZE = 78904532
My input record length is 4000033960.
Thanks,
Praveen.
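For reference, the failure in the log above is straightforward arithmetic: the record being written (145247 bytes) does not fit in the 131072-byte (128 KB) block of the dataset file. A minimal sketch of that check in Python, using only the two numbers from the error message (the power-of-two rounding is my own illustration, not DataStage's internal logic):

# Numbers taken verbatim from the fatal error above.
record_length = 145247      # "the length requested is: 145247"
max_block_length = 131072   # "the max block length is: 131072" (128 KB)

if record_length > max_block_length:
    # A block must hold at least one complete record, so the block size
    # would have to be raised to at least record_length; rounding up to
    # the next power of two is a conventional choice for block sizes.
    needed = max_block_length
    while needed < record_length:
        needed *= 2
    print("record of %d bytes needs a block of at least %d bytes (%d KB)"
          % (record_length, needed, needed // 1024))

Running this prints 262144 bytes (256 KB), i.e. the smallest conventional block size that would hold the failing record.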
chulett wrote: Your input record length is over 4 billion bytes?
Yes boss, you identified my actual problem.
Could you please advise how to resolve the problem?
Whenever a small number of records comes through (fewer than 50,000) the job works, but when more than 100,000 records come through I hit this problem.
Thanks,
Praveen.
ArndW wrote: If you are getting the same error message, then obviously the change isn't being accepted. Can you check the value in the director log for your run? ...
I checked the logs in DataStage. It is taking whatever values we pass, but when it reaches a record count of 22389 the job fails with the above error.
I am still puzzled as to why it is not taking the value from APT_DEFAULT_TRANSPORT_BLOCK_SIZE.
Could you let me know whether there are any other settings I need to specify in the job?
I assume the problem occurs mainly while writing the output dataset to disk.
Thanks,
Praveen.
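One possible explanation, offered as an assumption rather than something confirmed in this thread: APT_DEFAULT_TRANSPORT_BLOCK_SIZE governs the blocks passed between operators and is, per the DataStage documentation as I recall it, clamped into the range set by APT_MIN_TRANSPORT_BLOCK_SIZE and APT_MAX_TRANSPORT_BLOCK_SIZE, so an oversized request such as 78904532 would never take effect as written. The 131072-byte limit in the error belongs to the dataset file on disk, which (again as an assumption) is governed by APT_PHYSICAL_DATASET_BLOCK_SIZE instead. A sketch of that clamping:

# Assumed defaults from memory of the DataStage docs; verify
# APT_MIN_TRANSPORT_BLOCK_SIZE / APT_MAX_TRANSPORT_BLOCK_SIZE on your install.
MIN_TRANSPORT_BLOCK = 8192       # assumed minimum (8 KB)
MAX_TRANSPORT_BLOCK = 1048576    # assumed maximum (1 MB)

def effective_transport_block(requested):
    # Clamp the requested transport block size into the permitted range.
    return max(MIN_TRANSPORT_BLOCK, min(requested, MAX_TRANSPORT_BLOCK))

print(effective_transport_block(78904532))  # -> 1048576, not 78904532

If that clamping is what is happening, the setting would never actually reach 78904532, which would be consistent with the error staying exactly the same no matter what value is supplied.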
chulett wrote: Just how same is the "same error" you get after changing the APT variable? Exactly the same as in it still says "the max block length is: 131072" or just that it is still "too big" but you see ...
It always gives the error below:
APT_CombinedOperatorController(4),1: Fatal Error: File data set, file "/etl/IS/Datasets/ds_isap_irde020_am_gm_gbo_final.ds".; output of "cpyROw": the record is too big to fit in a block; the length requested is: 145247, the max block length is: 131072.
Whatever value I set for the environment variable, the error is the same.
Thanks,
Praveen.