Delete all Datasets from folder Datasets
Moderators: chulett, rschirm, roy
-
- Participant
- Posts: 136
- Joined: Wed May 07, 2008 11:26 am
- Location: Sydney, Australia
- Contact:
Delete all Datasets from folder Datasets
Hi Experts,
I want to delete all the existing datasets. All the datasets are created under the home path, in a folder called $DSHOME/Datasets.
Can I delete all the datasets by using the UNIX command rm *.* in the folder $DSHOME/Datasets?
Actually I want to create another folder in another path for datasets, mention that path in the config file, and create all the datasets afresh when the jobs run again.
regards,
Vinay
-
- Premium Member
- Posts: 1735
- Joined: Thu Mar 01, 2007 5:44 am
- Location: Troy, MI
You can. Even if you don't delete the descriptor files (*.ds) it will work, but it's better to delete the descriptor files as well to avoid strange errors.
I have deleted binary files without deleting the descriptor and it worked just fine (probably because of overwrite mode).
I haven't tried it, but it may create problems when using append mode.
Hence it's better if you delete the *.ds files too.
Priyadarshi Kunal
Genius may have its limitations, but stupidity is not thus handicapped.
-
- Participant
- Posts: 3337
- Joined: Mon Jan 17, 2005 4:49 am
- Location: United Kingdom
Even though you can, it will leave a mess behind - especially if you try to read any of them afterwards.
So best is to use "orchadmin rm". A simple
Code: Select all
for dsName in `ls -1 *.ds`
do
    orchadmin rm "$dsName"
done
will do the trick.
If you do it properly - as noted, delete all datasets and all control files - there would be no "mess" left. And there's no magic to deleting the files; orchadmin just does an "rm" as well. The only thing about orchadmin is that it is smarter than us: it can read the control file and thus knows exactly where all of the dataset files are for each control file.
So use orchadmin for individual control files and their matching dataset files, but for what the original poster stated as their need - "I want to delete all datasets" - you can just take off and nuke them from orbit. It's the only way to be sure.
-craig
"You can never have too many knives" -- Logan Nine Fingers
-
- Participant
- Posts: 136
- Joined: Wed May 07, 2008 11:26 am
- Location: Sydney, Australia
- Contact:
Hi,
Actually I want to know if I can delete the whole folder at once instead of deleting the datasets individually, to save time. I need to create a fresh folder for datasets in another path.
I am also facing a problem sourcing the dsenv file, with the following error:
-------------------------------------------------------------------------------------
# cd DSEngine
# . ./dsenv
ksh: t/datastage/Ascential/DataStage/DSEngine/bin:/dsetlsoft/datastage/Ascential/DataStage/PXEngine/bin:/oracle/app/product/10.1.0/bin: not found.
# pwd
/dsetlsoft/datastage/Ascential/DataStage/DSEngine
#
------------------------------------------------------------------------------------
Vinay
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
NOvintipa wrote:Actually i want to know if i could delete the whole folder at once instead of deleting datasets individually to save time. I need to create a fresh folder for datasets in another path.
Data Sets' data files live in multiple directories.
If you delete all the *.ds files (which is what I am guessing is your intent) you leave many orphan Data Set data files taking up space on your system.
The correct approach is to create a script that cycles through the "*.ds" file names and invokes orchadmin to delete the Data Set for which each is the descriptor file.
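A minimal sketch of such a cleanup script, assuming `orchadmin` is on the PATH and the environment (dsenv) is already sourced, run from the directory holding the descriptor files:

```shell
#!/bin/sh
# Loop over the descriptor files; orchadmin reads each descriptor and
# removes the data (segment) files it points to, then the descriptor itself.
for ds in *.ds
do
    # If no .ds files match, the literal pattern "*.ds" comes through; skip it.
    [ -e "$ds" ] || continue
    orchadmin rm "$ds"
done
```

Unlike a bare `ls *.ds` loop, this handles the empty-directory case and quotes the file name, so it also works if a descriptor name ever contains spaces.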
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
I'll chime in as well - don't delete the .ds files with "rm". You can cd to the directory with your dataset descriptors and issue the following if you'd rather not do an explicit loop.
Code: Select all
find . -name "*.ds" -exec orchadmin rm {} \;
-
- Premium Member
- Posts: 1735
- Joined: Thu Mar 01, 2007 5:44 am
- Location: Troy, MI
If you look into the descriptor file, the last few lines contain the names and paths of the dataset's binary (data) files.
Orchadmin reads the descriptor, deletes all the binary files associated with it, and then deletes the descriptor itself.
If you have to delete the folder of binary files and also delete all the descriptor files, it's OK to use the rm command.
Although, as mentioned by ArndW, it's easy (though not easier) to use orchadmin, and it can be used (at least it is the formal way). But you can use rm without any problem in this case (easier).
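If every dataset really does resolve to a single resource directory and everything is being recreated from scratch, the brute-force rm route might look like this (both paths are illustrative, not the OP's actual ones):

```shell
# Illustrative only: substitute your own descriptor and resource directories.
# Remove every descriptor file...
rm -f /your/descriptor/dir/*.ds
# ...and every data (segment) file in the single resource directory.
rm -rf /your/resource/dir/Datasets/*
```

This is only safe when you are certain no dataset's data files live anywhere else; otherwise the orchadmin route above is the one that cannot leave orphans.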
Priyadarshi Kunal
Genius may have its limitations, but stupidity is not thus handicapped.
-
- Participant
- Posts: 3337
- Joined: Mon Jan 17, 2005 4:49 am
- Location: United Kingdom
Priyadarshi,
It is enough to delete the .ds files alone, but the data files will become orphans (and will still consume space).
The point Ray, ArndW and I are making is about the limited knowledge of the OP: guiding them to use rm outside the tool may result in some other damage.
Also, it is possible that file sets and lookup tables create objects which could be deleted by rm.
Jeez, what a bunch of anal little monkeys.
It's not about deleting just the ".ds" files; if you actually read the starting post rather than just chiming in midstream, you'll see it's about deleting all datasets (which are stored in one location) so they can be recreated fresh in a new location on the next job run. Hence my comments. And no one here advocated simply deleting just the .ds files.
-craig
"You can never have too many knives" -- Logan Nine Fingers