We have two situations in our project where datasets are causing severe problems.

We are on DataStage 8.1. The configuration file has three nodes, each with a different fastname.

First problem: when we run a job with 5 million records, it aborts with an error saying it is unable to write on (fd 3), cannot write to node 3, insufficient space, even though we have 200 GB allotted.
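For reference, our configuration file is shaped roughly like the sketch below; the fastnames and paths here are placeholders, not our real values. My understanding is that the error points at whichever filesystem backs node 3's resource or scratch disk filling up, even though 200 GB is allotted overall across the nodes.

{
  node "node1"
  {
    fastname "server1"
    pools ""
    resource disk "/ds/node1/res" {pools ""}
    resource scratchdisk "/ds/node1/scratch" {pools ""}
  }
  node "node2"
  {
    fastname "server2"
    pools ""
    resource disk "/ds/node2/res" {pools ""}
    resource scratchdisk "/ds/node2/scratch" {pools ""}
  }
  node "node3"
  {
    fastname "server3"
    pools ""
    resource disk "/ds/node3/res" {pools ""}
    resource scratchdisk "/ds/node3/scratch" {pools ""}
  }
}

As I understand it, the resource disk paths are where persistent dataset segment files are written and the scratchdisk paths are where sorts and buffering spill, so the insufficient-space message would appear as soon as one of these filesystems fills, regardless of how much space the other nodes still have.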
Second problem: we deleted datasets from the folder directly, but in a multi-node configuration I think that deletes only the dataset descriptor and not the actual data files. Hence, even though we've deleted the datasets from the folder, the huge data files are still visible in the /node1/res folder. In other words, what we deleted was just a pointer to the actual dataset, not the dataset itself. Can anyone suggest how to delete these actual data files from the /node1/res folder?
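For datasets whose descriptor file still exists, I assume the proper way to remove them is orchadmin (or the Data Set Management tool in Designer) rather than deleting the descriptor with rm, something along the lines of the sketch below. The dataset path and the configuration-file path are placeholders, and I'm assuming dsenv is sourced from $DSHOME as usual. Can someone confirm whether this is right, and whether the segment files already orphaned under /node1/res can only be identified and removed by hand?

# assumed usage sketch; paths below are placeholders, not our real ones
. $DSHOME/dsenv                               # set up the DataStage engine environment
export APT_CONFIG_FILE=/path/to/config.apt    # same configuration file the jobs run with
orchadmin rm /data/project/target_table.ds    # remove the descriptor and its data files on every node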
Thanks & Regards
Parag Saundattikar
Certified for Infosphere DataStage v8.0