deleting /tmp/* caused "score file deleted" warning

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

Post Reply
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

You deleted files still in use by a currently running job, and the job logged the fact that it couldn't find them when it went to remove them. Just... don't do that.
-craig

"You can never have too many knives" -- Logan Nine Fingers
tbtcust
Premium Member
Posts: 230
Joined: Tue Mar 04, 2008 9:07 am

Post by tbtcust »

Thanks chulett.

1) The jobs continue to throw this warning. How can I stop the jobs from throwing it? Can I just restore the files?
2) What are these files used for?
3) Can I redirect which drive and folder the files are written to? There were about 10 GB worth when I deleted them.
4) Is there a way to tie the files to the jobs? We have lots of jobs that are no longer valid.

Thanks in advance for your help
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

1) Because you still have jobs running, I assume. If you restore the ones for the jobs that have not completed yet, I would assume the warning would no longer be generated.

2) Temporary stuff, whatever temp data the process needs to accumulate while the job runs, I would imagine. Honestly can't say other than that.

3) I'm not sure; I believe it defaults to whatever you have set as UVTEMP in the uvconfig file, but I believe you can override that in your configuration file. Someone else will need to clarify.
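
For the parallel engine, the "config file" override mentioned above would be the scratchdisk resource in the APT configuration file. A sketch of one node's entry follows; the hostname and paths are placeholders, not values from this thread:

```
{
  node "node1"
  {
    fastname "etlhost"
    pools ""
    resource disk "/data/ds/datasets" {pools ""}
    resource scratchdisk "/data/ds/scratch" {pools ""}
  }
}
```

Pointing `resource scratchdisk` at a filesystem with enough headroom is the usual way to keep parallel temp data off a small /tmp.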

4) I doubt it, but again I don't know. Best to clean things like "/tmp" by the age of the file, if you need to do it manually. Typically this is an automatic thing the SAs set up.
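
The age-based cleanup described above can be sketched with `find -mtime`. The demo below works on a throwaway directory so it is safe to run anywhere; in practice you would point it at a dedicated scratch area, never at all of /tmp while jobs are still running:

```shell
#!/bin/sh
# Demo: delete regular files older than 7 days from a scratch directory.
set -eu
SCRATCH=$(mktemp -d)                          # stand-in for your real scratch area
touch "$SCRATCH/recent.tmp"                   # fresh file, should survive
touch -t 202001010000 "$SCRATCH/stale.tmp"    # backdated file, should be removed

# -mtime +7 matches files last modified more than 7 days ago
find "$SCRATCH" -type f -mtime +7 -exec rm -f {} +
```

After the run, only `recent.tmp` remains in the directory.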
-craig

"You can never have too many knives" -- Logan Nine Fingers
tbtcust
Premium Member
Posts: 230
Joined: Tue Mar 04, 2008 9:07 am

Post by tbtcust »

Thanks chulett. This is very helpful.
Oritech
Premium Member
Posts: 140
Joined: Thu May 07, 2009 9:32 pm

Post by Oritech »

I am a new premium member on this forum...

I am also getting this warning in every job that runs.

We are finding heaps of files accumulated in the tmp space, and we are deleting the older ones manually to release space.

Through UVTEMP in the uvconfig file we can specify the path for the tmp files, but not the automatic deletion of them.

How can the deletion be automated?
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

As noted, that's typically the purview of whomever administers your system. Either the O/S will have something built in or a cron script will run (typically) once a day to clean out files in "temp" locations over X days old. It's really not something anyone not "in authority" should be doing. Talk to your SysAdmins if it is an issue.
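
A once-a-day cron job of the kind described above might look like this. The schedule, path, and retention period here are illustrative placeholders; never point such an entry at all of /tmp on a shared box without your SysAdmin's blessing:

```
# Example crontab entry (crontab -e): every day at 02:00, delete regular
# files older than 7 days from a dedicated ETL scratch area.
0 2 * * * find /data/etl_tmp -type f -mtime +7 -delete
```

Note that `-delete` is available in GNU and BSD find; on other platforms substitute `-exec rm -f {} +`.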

One possible answer is to create "temp" space specific to your ETL processes, somewhere with enough space where you have the credentials to maintain them. Then you can whack whatever needs whacking when it's whacking time without fear of reprisal. Unless you over-whack, of course. :wink:
-craig

"You can never have too many knives" -- Logan Nine Fingers
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Whacko!
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

<whack!>
-craig

"You can never have too many knives" -- Logan Nine Fingers
Oritech
Premium Member
Posts: 140
Joined: Thu May 07, 2009 9:32 pm

Post by Oritech »

Really Whacko! thanks 8)
asorrell
Posts: 1707
Joined: Fri Apr 04, 2003 2:00 pm
Location: Colleyville, Texas

Post by asorrell »

Most UNIX systems automatically clean up /tmp on re-boot, which is good enough for a lot of systems (if you boot moderately often). If you re-boot your Windows system often enough, add a cleanup script to the end of your boot process, before any DataStage jobs start!
Andy Sorrell
Certified DataStage Consultant
IBM Analytics Champion 2009 - 2020
Oritech
Premium Member
Posts: 140
Joined: Thu May 07, 2009 9:32 pm

Post by Oritech »

How do I add cleanup scripts?
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

That's definitely not a DataStage question. Consult your system administrator for assistance with automatic start and shutdown scripts.

If you want to amend the DataStage startup/shutdown script, you will find this in the location reported by the uv -admin -info command. Make very sure you take a backup copy of this script before modifying it!
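
The "back up before you modify" step is worth making a habit. A generic sketch follows; FILE here is a throwaway demo file standing in for the real script reported by `uv -admin -info`, so the snippet is safe to run anywhere:

```shell
#!/bin/sh
# Demo: take a dated backup copy before editing any engine script.
set -eu
FILE=$(mktemp)                        # stand-in for the real startup script
printf 'original contents\n' > "$FILE"

BACKUP="$FILE.$(date +%Y%m%d).bak"
cp -p "$FILE" "$BACKUP"               # -p preserves mode and timestamps
```

With a dated `.bak` copy in place, a bad edit can be reversed by copying the backup over the modified script.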
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Post Reply