
CFF Record

Posted: Mon Jul 31, 2006 10:46 am
by Andet
Is there a maximum record size for CFF or sequential flat files? I haven't been able to find anything on record sizes, just file sizes...

thanks,


Ande

Posted: Mon Jul 31, 2006 11:00 am
by samba
There is no limit on the size of a sequential file, but you should be careful about the 'ulimit'. If you are dealing with a large sequential file, it's always better to set the file size in 'ulimit' to 'unlimited'.
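A quick sketch of how to check and lift that limit on most Unix shells (whether your DataStage user may raise it depends on the hard limit your administrator has set):

# Check the current file-size limit (many shells report it in 512-byte blocks):
ulimit -f

# Lift it for the current session, e.g. in dsenv or the user's profile:
ulimit -f unlimited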

Posted: Mon Jul 31, 2006 4:46 pm
by Andet
My question concerned record size, not file size.
I.e., would there be any problem processing a CFF (COBOL) record greater than 160500 bytes?

Thanks,

Ande

Record size

Posted: Tue Aug 01, 2006 7:31 am
by Lotus26
Hi
I also got the same type of file, with a record size of 160446 bytes.
So I wonder whether there will be any problem importing such a file, or whether there is an environment variable I need to set while importing.

If anybody has the information, please share it with us.

I appreciate your time.

Posted: Tue Aug 01, 2006 7:41 am
by ArndW
Since you have the data, why don't you just try to read & write it? You won't have any issues with record lengths of 160 KB!

record size

Posted: Tue Aug 01, 2006 9:33 am
by Lotus26
Hi

I tried to import that file and it is failing with:

main_program: Fatal Error: unknown block format [datamgr/datamgr.C:759]


There are some decimal fields in it, for which I am getting:

main_program: Fatal Error: Not a v1.1 type: decimal[4,0] [api/schema/schema.C:3869]



and I am also getting:

ImportSP,0: Fatal Error: File data set, file "{0}".; output of "ImportSP": the record is too big to fit in a block; the length requested is: 160422. [api/dataset_rep1.C:3133]

Any suggestions on these errors?
Thanks.

Posted: Wed May 16, 2007 5:56 am
by lfong
In the Administrator, set the option APT_AUTO_TRANSPORT_BLOCK_SIZE to false.
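For anyone who hits this later, a sketch of the related settings as they might appear in the Administrator or dsenv. The variable names are genuine PX environment variables, but the values are only examples, and the 131072-byte default is from memory, not from this thread; the "record is too big to fit in a block" error simply means the record is longer than the transport block.

# Disable automatic transport block sizing, as suggested above:
APT_AUTO_TRANSPORT_BLOCK_SIZE=false

# If records still don't fit, the fixed block size (131072 bytes by default)
# likely also needs to exceed the longest record (160422 bytes here), e.g.:
APT_DEFAULT_TRANSPORT_BLOCK_SIZE=262144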