
Problem with datasets

Posted: Thu Dec 10, 2009 10:22 pm
by parag.s.27
We have two situations in our project that are causing severe problems.

We are on DataStage 8.1. The config file has three nodes, each with a different fastname.

When we run a job with 5 million records, it aborts with an error along the lines of: unable to write on (fd 3), cannot write to node 3, insufficient space.

This happens even though we have 200 GB allotted.

The second problem: we deleted datasets from the folder directly, but on a multi-node configuration I think this deletes only the dataset descriptor, not the actual data files.

Hence, even though we've deleted the datasets from the folder, the huge data files are still visible in the /node1/res folder. That means what we deleted was just a pointer to the actual dataset, not the dataset itself. Can anyone suggest how to delete these actual data files from the /node1/res folder?

Posted: Thu Dec 10, 2009 11:18 pm
by ray.wurlod
Can you please post your configuration file?

Posted: Fri Dec 11, 2009 4:21 am
by Sainath.Srinivasan
Did you check the ulimit?

How are you trying to remove the datasets? i.e. what command(s)?
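For reference, a hedged sketch (paths and the free-space threshold below are examples, not taken from your setup): the descriptor (.ds) file is only a pointer, so removing it with plain rm leaves the segment files behind on each node's resource disk; the orchadmin utility that ships with the PX engine removes both. A simple pre-flight check of free space on each resource directory can also catch the "insufficient space" abort before a big run:

```shell
#!/bin/sh
# Supported way to delete a parallel dataset (removes descriptor AND
# segment files on all nodes) -- run on the engine host, not the verifier:
#   orchadmin rm /path/to/mydataset.ds
# (some versions use: orchadmin delete /path/to/mydataset.ds)

# Example pre-flight space check for the resource/scratch directories
# listed in your configuration file. Threshold is an arbitrary example.
min_free_kb=1048576   # require at least 1 GB free (example value)

check_space() {
    dir="$1"
    # POSIX df -Pk reports in 1K blocks; column 4 is "Available"
    avail=$(df -Pk "$dir" | awk 'NR==2 {print $4}')
    if [ "$avail" -lt "$min_free_kb" ]; then
        echo "LOW: $dir has ${avail} KB free"
    else
        echo "OK: $dir has ${avail} KB free"
    fi
}

# Replace /tmp with the resource disk paths from your own config file,
# e.g. check_space /node1/res ; check_space /node2/res ; ...
check_space /tmp
```

Running this against each node's resource and scratch paths before the job shows which node is actually short of space, since the 200 GB may not be available on the filesystem that node 3's resource disk lives on.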