CFF Record

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

Andet
Charter Member
Posts: 63
Joined: Mon Nov 01, 2004 9:40 am
Location: Clayton, MO

CFF Record

Post by Andet »

Is there a maximum record size for CFF or sequential flat files? I haven't been able to find anything on record sizes, just file sizes...

thanks,


Ande
samba
Premium Member
Posts: 62
Joined: Wed Dec 07, 2005 11:44 am

Post by samba »

There is no limit on the size of a sequential file, but you should be careful about 'ulimit'. If you are dealing with a large sequential file, it's always better to set the file-size limit in 'ulimit' to 'unlimited'.
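As a quick check, the file-size limit can be inspected and raised from the shell. This is only a sketch; the exact behavior depends on the OS, and raising the soft limit only succeeds if the hard limit permits it:

```shell
# Inspect the file-size limit for the current shell session.
# Prints "unlimited" or a maximum size in 512-byte blocks.
ulimit -f

# Raise the soft limit for this session (only succeeds if the hard
# limit allows it; otherwise an administrator must change it, e.g.
# in /etc/security/limits.conf on Linux).
ulimit -f unlimited
```

Note that the limit must be raised in the environment that actually runs the DataStage jobs, not just in an interactive login shell.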
samba
Andet
Charter Member
Posts: 63
Joined: Mon Nov 01, 2004 9:40 am
Location: Clayton, MO

Post by Andet »

My question concerned record size, not file size.
That is, would there be any problem processing a CFF (COBOL) record greater than 160500 bytes?

Thanks,

Ande
Lotus26
Premium Member
Posts: 48
Joined: Tue Jul 13, 2004 2:09 pm

Record size

Post by Lotus26 »

Hi
I also got the same type of file, with a record size of 160446 bytes.
So I wonder whether there will be any problem importing such a file, or whether there is any environment variable I need to set while importing.

If anybody has the information, please share it with us.

I appreciate your time.
Regards
Lotus26
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Since you have the data, why don't you just try to read and write it? You won't have any issues with record lengths of 160 KB!
Lotus26
Premium Member
Posts: 48
Joined: Tue Jul 13, 2004 2:09 pm

record size

Post by Lotus26 »

Hi

I tried to import that file and it is failing with:

main_program: Fatal Error: unknown block format [datamgr/datamgr.C:759]


There are some decimal fields in it, and for those I am getting:

main_program: Fatal Error: Not a v1.1 type: decimal[4,0] [api/schema/schema.C:3869]



and I am also getting:

ImportSP,0: Fatal Error: File data set, file "{0}".; output of "ImportSP": the record is too big to fit in a block; the length requested is: 160422. [api/dataset_rep1.C:3133]

Any suggestions on these errors?
Thanks.
Regards
Lotus26
lfong
Premium Member
Posts: 42
Joined: Fri Sep 09, 2005 7:48 am

Post by lfong »

In the Administrator, set the option APT_AUTO_TRANSPORT_BLOCK_SIZE
to false.
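For context on the "record is too big to fit in a block" error: the parallel engine's default transport block size is typically 131072 bytes (128 KB), which is smaller than the 160422-byte record in the log above. A sketch of the equivalent settings as environment variables (they are normally set per-project in the Administrator or as job parameters; the 262144 value below is an illustrative assumption, not a recommended figure):

```shell
# Illustrative sketch: these APT_* variables are usually defined
# per-project in the DataStage Administrator; exporting them in the
# environment that launches the job has the same effect.
export APT_AUTO_TRANSPORT_BLOCK_SIZE=false        # disable automatic block sizing
export APT_DEFAULT_TRANSPORT_BLOCK_SIZE=262144    # 256 KB, large enough for a 160422-byte record
```

Whatever block size is chosen must be at least as large as the longest record the job has to move between operators.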