Problem with datasets

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

Post Reply
parag.s.27
Participant
Posts: 221
Joined: Fri Feb 17, 2006 3:38 am
Location: India
Contact:

Problem with datasets

Post by parag.s.27 »

We have two situations in our project that are causing severe problems.

We have DS 8.1. The config file has 3 nodes, each with a different fastname.

When we run a job with 5 million records, it aborts with an error saying "unable to write on (fd 3), cannot write to node 3, insufficient space" — even though we have 200 GB allotted.
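For reference, a 3-node configuration file for this kind of setup typically looks like the sketch below. The fastnames and paths are placeholders (only /node1/res comes from this thread); the "insufficient space" error usually points at the resource disk or scratchdisk defined for the failing node, not at total server capacity, so it is worth checking `df` on each of those specific paths.

```
{
  node "node1" {
    fastname "server1"                          /* placeholder hostname */
    pools ""
    resource disk "/node1/res" {pools ""}       /* dataset segments land here */
    resource scratchdisk "/node1/scratch" {pools ""}
  }
  node "node2" {
    fastname "server2"
    pools ""
    resource disk "/node2/res" {pools ""}
    resource scratchdisk "/node2/scratch" {pools ""}
  }
  node "node3" {
    fastname "server3"
    pools ""
    resource disk "/node3/res" {pools ""}
    resource scratchdisk "/node3/scratch" {pools ""}
  }
}
```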

The second problem: we deleted datasets from the folder directly, but on a multi-node configuration I think this deletes only the dataset descriptor, not the actual data.

Hence, even though we've deleted the datasets from the folder, the huge data files are still visible in the /node1/res folder. That means what we deleted was just a pointer to the actual dataset, not the dataset itself. Can anyone suggest how to delete these actual data files from the /node1/res folder?
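For what it's worth, the usual way to remove a parallel dataset is with the orchadmin utility (or the Data Set Management tool in the Designer client), which removes both the descriptor and the segment files on every node. A rough sketch, assuming the descriptor file still exists and using example paths — check `orchadmin help` on your version for the exact command names:

```
# Source the engine environment and point at the config file
# (install paths are examples, adjust for your site)
. /opt/IBM/InformationServer/Server/DSEngine/dsenv
export APT_CONFIG_FILE=/path/to/3node.apt

# Inspect the dataset and its segments
orchadmin describe /data/datasets/mydataset.ds

# Remove the descriptor AND the data files on all nodes
orchadmin delete /data/datasets/mydataset.ds
```

If the descriptor has already been removed with a plain `rm`, orchadmin has nothing to work from, and the orphaned segment files under each node's resource disk generally have to be identified and removed manually.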
Thanks & Regards
Parag Saundattikar
Certified for Infosphere DataStage v8.0
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Can you please post your configuration file?
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Sainath.Srinivasan
Participant
Posts: 3337
Joined: Mon Jan 17, 2005 4:49 am
Location: United Kingdom

Post by Sainath.Srinivasan »

Did you check the ulimit?

How are you trying to remove the datasets? i.e., what command(s) are you using?
Post Reply