Is there a way to Capture what Sequential files were Processed
Moderators: chulett, rschirm, roy
-
- Premium Member
- Posts: 17
- Joined: Mon Feb 18, 2008 7:56 pm
- Location: Warwick, RI
I have CSV files that need to be processed three times a day, at 9:30, 1:30 and 2:30.
They are *.DAT files.
The Source / Read Method = File Pattern.
The job runs on 1 node so the rows don't get mixed up.
Example Files
Filename Date/Time
File1.DAT 05/31/11 08:02 AM
File2.DAT 05/31/11 08:12 AM
File3.DAT 05/31/11 08:23 AM
File4.DAT 05/31/11 08:27 AM
They are all processed, appended together, and output as a single file, OMA.DAT.
Is there a way to track, or write to a file, the name of each file that DataStage processes, for an audit trail?
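One common approach is to snapshot the directory from a shell step before the job runs. This is only a sketch, assuming the job is launched from a script or a before-job ExecSH call; the SRC_DIR and AUDIT_LOG names are illustrative, not anything DataStage-specific:

```shell
#!/bin/sh
# Minimal before-job audit sketch: record the name, size, and timestamp of
# every *.DAT file the File Pattern read is about to pick up.
# SRC_DIR and AUDIT_LOG are illustrative names, not DataStage settings.
SRC_DIR=${SRC_DIR:-/data/DataStage/OPTICOM/SOURCE_FILES}
AUDIT_LOG=${AUDIT_LOG:-$SRC_DIR/audit_$(date '+%m%d%Y-%H%M').log}

# The *.DAT glob matches the same set of files the job's File Pattern
# reads, so the log line-for-line covers what this run consumed.
ls -l "$SRC_DIR"/*.DAT > "$AUDIT_LOG"
```

Because the log name carries a date-time stamp, each of the 9:30, 1:30 and 2:30 runs leaves its own snapshot instead of overwriting the last one.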
jOE fOnSEca
-
- Participant
- Posts: 246
- Joined: Mon Jun 30, 2008 3:22 am
- Location: New York
- Contact:
Does this look right?
Here's what I put together, but it's not writing anything to the file:
ls -a /data/DataStage/OPTICOM/SOURCE_FILES test.txt
Does that look wrong for a script?
-Joe
jOE fOnSEca
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
Or use the date command surrounded by backquotes as part of the file name.
ls -a /data/DataStage/OPTICOM/SRC_FilePattern* > /data/DataStage/OPTICOM/FileNames_`date +"%Y%m%d"`.txt
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
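The same idea can also be written with $(...) command substitution, which is equivalent to backquotes but easier to quote and nest; -1 prints one name per line, which keeps the audit file easy to parse later. The paths and the SRC_FilePattern prefix here are taken from the line above:

```shell
# One file name per line, log named by run date via $(...) substitution.
ls -1 /data/DataStage/OPTICOM/SRC_FilePattern* \
    > "/data/DataStage/OPTICOM/FileNames_$(date +%Y%m%d).txt"
```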
I got it, I think.
Here's the code I ended up with. It lists the contents of the folder and writes that to a LOG.MMDDYYYY-HHMM.txt file.
Then it MOVES all the *.DAT files in the folder to the BACKUP folder.
-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
clear
echo "OPTICOM BACKUP START"
echo " "
echo " "
echo " "
ls -all /data/DataStage/OPTICOM/SOURCE_FILES/*.* > /data/DataStage/OPTICOM/SOURCE_FILES/BACKUP/LOG.`date '+%m%d%Y-%H%M'`.txt
echo " "
echo " "
echo "BATCH MOVE OF FILES TO BACKUP FOLDER"
echo " "
echo " "
mv /data/DataStage/OPTICOM/SOURCE_FILES/*.DAT /data/DataStage/OPTICOM/SOURCE_FILES/BACKUP
echo " "
echo " "
echo " "
echo " "
echo " "
echo " "
echo "OPTICOM BACKUP COMPLETE"
echo " "
echo " "
jOE fOnSEca