datasets

Posted: Wed Oct 13, 2004 9:47 am
by nag0143
Does a dataset have any limitation on file size?
When I try to load 1,400,000 records into a dataset I get the following error:

Write to dataset failed: File too large

Orchestrate was unable to write to any of the following files:

buffer(20),0: Fatal Error: waitForWriteSignal(): Premature EOF on node

Posted: Wed Oct 13, 2004 3:18 pm
by ray.wurlod
DataStage does not limit the size of a dataset, but your operating system may.

Does your operating system have a limit set on file size? If you are not running on a 64-bit-enabled file system, the maximum UNIX file size is 2GB.

What is your ulimit setting for file size? Type the command ulimit -a to find out. This is a per-user setting; your UNIX administrator can change it if it's the limiting factor. We tend to recommend "unlimited" as the correct value to use for the user ID under which DataStage jobs are run.

Beware also that your data ulimit may need to be made larger than default, particularly if you are using bulk loader technology.
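
A quick sketch of what to check from the shell (the output format varies by shell, and the values here are illustrative only):

  $ ulimit -a
  time(seconds)  unlimited
  file(blocks)   2097151      <-- the one that matters for this error
  data(kbytes)   131072
  ...

  $ ulimit -f unlimited       <-- raises the file limit for this session, if the hard limit permits

For a permanent fix on AIX, the administrator would typically set fsize = -1 (unlimited) for the affected user in /etc/security/limits, after which the user must log in again.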

Error: File too large

Posted: Sun Feb 20, 2005 10:17 pm
by Gazelle
Our PX job, run via Designer, fails with the same message: "...File too large".

The ulimit shows:
file(blocks) = unlimited

Our sysadmin questioned whether we "connect" to the userid using "su" or "su -". If the former, the session inherits the profile of the original login rather than running the target user's own profile.
BTW, AIX with JFS2 supports file sizes up to 16TB (architecturally, JFS2 can go to 4PB)!
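
A quick way to see the difference from the command line (dsadm as in our setup; the commands are standard su/ulimit):

  su dsadm -c "ulimit -f"     # no dash: dsadm's login profile is not run
  su - dsadm -c "ulimit -f"   # dash: simulates a full login, so dsadm's own profile applies

If the two report different values, sessions started with plain "su" are not picking up the limits intended for dsadm.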

I notice that our dsadm userid has a ulimit of:
file(blocks) = 2097151
(about 1GB, since the limit is counted in 512-byte blocks)

Can anyone explain how the PX client connects to UNIX to run a DataStage job?
If we set the dsadm ulimit, will that fix the problem?

Posted: Tue Feb 22, 2005 2:52 pm
by T42
Actually, it is rsh that DataStage PX depends on to start its processes on each node, so the limits that matter are the ones those remote shells inherit. Do this:

In the before-job subroutine, add an ExecSH call with the command: ulimit -a

Run the job. See what the results are.
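
A sketch of what that looks like, assuming the standard ExecSH before-job subroutine (the field names below are from the Job Properties dialog):

  Before-job subroutine:  ExecSH
  Input value:            ulimit -a

The command's output is written to the job log, so you see the limits of the environment that actually spawns the PX processes, not those of your interactive shell.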