
Is there a way to capture which sequential files were processed?

Posted: Tue May 31, 2011 10:23 am
by joefonseca79
I have CSV files that need to be processed three times a day, at 9:30, 1:30, and 2:30.
They are *.DAT files.

The Source / Read Method = File Pattern
The job runs on 1 node so that the rows don't get mixed up.

Example Files

Filename Date/Time
File1.DAT 05/31/11 08:02 AM
File2.DAT 05/31/11 08:12 AM
File3.DAT 05/31/11 08:23 AM
File4.DAT 05/31/11 08:27 AM

They are all processed, appended to each other, and output as a single file, OMA.DAT.

Is there a way to track, or write to a file, the names of the files that DataStage processes, for an audit trail?

Posted: Tue May 31, 2011 10:38 am
by arunkumarmm
I'm not sure if there is a way to list the file names that were processed, but you can run an ls command with the same file pattern and print the output to the log.

Posted: Tue May 31, 2011 11:17 am
by joefonseca79
Is the ls command a UNIX/Linux command? Or can you do that in a sequence job?

-Joe

Posted: Tue May 31, 2011 1:23 pm
by arunkumarmm
It's a UNIX command to list the files in a specified path or the current directory. And yes, you have the Execute Command activity in the job sequence to execute any command/script.

Does this look right?

Posted: Tue May 31, 2011 1:43 pm
by joefonseca79
Here's what I put together, but it's not writing anything to the file.

ls -a /data/DataStage/OPTICOM/SOURCE_FILES test.txt

Does that look wrong for a script?

-Joe

Posted: Tue May 31, 2011 1:52 pm
by arunkumarmm
Well, you are not redirecting the output to a file in your command. Try this:

ls -a /data/DataStage/OPTICOM/SRC_FilePattern* > /data/DataStage/OPTICOM/FileNames.txt

Posted: Tue May 31, 2011 3:05 pm
by joefonseca79
Great, thanks for the example. I got this going. One last question: do you know how to make the file name I'm writing to dynamic, so it includes the date and time?

Posted: Tue May 31, 2011 3:23 pm
by arunkumarmm
Add a parameter to your job, pass the value (job start date or current timestamp) from the sequence, and append the parameter to the file name as required.
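
For example (a minimal sketch, not from the thread, assuming the sequence calls a small wrapper script and passes the timestamp as its first argument):

Code:

#!/bin/sh
# hypothetical wrapper script: the sequence's Execute Command activity
# passes the job start timestamp (e.g. 201105311330) as $1
SUFFIX=$1
ls -a /data/DataStage/OPTICOM/SRC_FilePattern* > /data/DataStage/OPTICOM/FileNames_${SUFFIX}.txt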

Posted: Tue May 31, 2011 5:06 pm
by ray.wurlod
Or use the date command surrounded by backquotes as part of the file name.

Code:

ls -a /data/DataStage/OPTICOM/SRC_FilePattern* > /data/DataStage/OPTICOM/FileNames_`date +"%Y%m%d"`.txt

I got it, I think

Posted: Wed Jun 01, 2011 8:02 am
by joefonseca79
Here's the code that I ended up with. It reads the contents of the folder and writes the listing to a LOG file stamped MMDDYYYY-HHMM.

Then it takes all the files in the folder with *.DAT and MOVES them to the BACKUP folder.

-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=

Code:

clear
echo "OPTICOM BACKUP START"
echo " "
echo " "
echo " " 
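# list everything in SOURCE_FILES and write the listing to a timestamped log file in the BACKUP folder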
ls -all /data/DataStage/OPTICOM/SOURCE_FILES/*.* > /data/DataStage/OPTICOM/SOURCE_FILES/BACKUP/LOG.`date '+%m%d%Y-%H%M'`.txt
echo " "
echo " "
echo  "BATCH MOVE OF FILES TO BACKUP FOLDER"
echo " "
echo " "
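# move the processed .DAT files into the BACKUP folder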
mv /data/DataStage/OPTICOM/SOURCE_FILES/*.DAT /data/DataStage/OPTICOM/SOURCE_FILES/BACKUP
echo " "
echo " "
echo " "
echo " "
echo " "
echo " "
echo "OPTICOM BACKUP COMPLETE"
echo " "
echo " "

Posted: Wed Jun 01, 2011 9:00 am
by chulett
Hopefully nothing new has shown up between the time you processed the files and the time you ran this.
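
One way to guard against that (a minimal sketch, not from the thread, assuming the job only reads files that existed when the listing was taken and that the log holds bare file names):

Code:

#!/bin/sh
# capture the exact set of .DAT files before the job runs
cd /data/DataStage/OPTICOM/SOURCE_FILES
LOG=BACKUP/LOG.`date '+%m%d%Y-%H%M'`.txt
ls *.DAT > $LOG

# ... DataStage processes the files here ...

# afterwards, move only the files named in the log, so anything that
# arrived in the meantime is left alone for the next run
while read f
do
    mv "$f" BACKUP/
done < $LOG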

Posted: Wed Jun 01, 2011 9:27 am
by joefonseca79
I have DataStage pick up all the files and process them; then it sends the output file via email. After that, the UNIX script writes the log file and moves the source files to the backup folder.

Orders are sent to us every 15 minutes. So I think I'll be good.