Is there a way to Capture what Sequential Files were Processed

A forum for discussing DataStage® basics. If you're not sure where your question goes, start here.


joefonseca79
Premium Member
Posts: 17
Joined: Mon Feb 18, 2008 7:56 pm
Location: Warwick, RI

Is there a way to Capture what Sequential Files were Processed

Post by joefonseca79 »

I have CSV files that need to be processed three times a day, at 9:30, 1:30 and 2:30.
They are *.DAT files.

The Source / Read Method = File Pattern.
The job runs on 1 node so that the rows don't get mixed up.

Example Files

Filename Date/Time
File1.DAT 05/31/11 08:02 AM
File2.DAT 05/31/11 08:12 AM
File3.DAT 05/31/11 08:23 AM
File4.DAT 05/31/11 08:27 AM

They are all processed, appended to each other, and output as a single file, OMA.DAT.

Is there a way to track or write to a file the name of the file that DataStage is processing for an audit trail?
jOE fOnSEca
arunkumarmm
Participant
Posts: 246
Joined: Mon Jun 30, 2008 3:22 am
Location: New York

Post by arunkumarmm »

I'm not sure if there is a way to list the file names that were processed, but you can use an 'ls' command with the same file pattern and print it to the log.
Arun
joefonseca79
Premium Member
Posts: 17
Joined: Mon Feb 18, 2008 7:56 pm
Location: Warwick, RI

Post by joefonseca79 »

Is the ls command a UNIX/Linux command? Or can you do that in a sequence job?

-Joe
jOE fOnSEca
arunkumarmm
Participant
Posts: 246
Joined: Mon Jun 30, 2008 3:22 am
Location: New York

Post by arunkumarmm »

It's a UNIX command to list the files in a specified path or the current directory. And yes, you have the Execute Command activity in the job sequence to execute any command or script.
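
For example (just a sketch; the path below is a placeholder for wherever your *.DAT files land), the command in the Execute Command activity could be as simple as:

Code:

ls /path/to/source_files/*.DAT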
Arun
joefonseca79
Premium Member
Posts: 17
Joined: Mon Feb 18, 2008 7:56 pm
Location: Warwick, RI

Does this look right?

Post by joefonseca79 »

Here's what I put together, but it's not writing anything to the file.

ls -a /data/DataStage/OPTICOM/SOURCE_FILES test.txt

Does that look wrong for a script?

-Joe
jOE fOnSEca
arunkumarmm
Participant
Posts: 246
Joined: Mon Jun 30, 2008 3:22 am
Location: New York

Post by arunkumarmm »

Well, you are not writing it to an output file in your command. Try this:

ls -a /data/DataStage/OPTICOM/SRC_FilePattern* > /data/DataStage/OPTICOM/FileNames.txt
Arun
joefonseca79
Premium Member
Posts: 17
Joined: Mon Feb 18, 2008 7:56 pm
Location: Warwick, RI

Post by joefonseca79 »

Great, thanks for the example. I got this going. One last question: do you know how to make the file that I'm writing to dynamic, so the date and time are added to the file name?
jOE fOnSEca
arunkumarmm
Participant
Posts: 246
Joined: Mon Jun 30, 2008 3:22 am
Location: New York

Post by arunkumarmm »

Add a parameter to your job, pass the value (job start date or current timestamp) from the sequence, and append the parameter to the file name as required.
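
For example (a hypothetical sketch, assuming a parameter named FileDate that the sequence fills with the run's timestamp, and using DataStage's #parameter# notation), the redirect could become:

Code:

ls -a /data/DataStage/OPTICOM/SRC_FilePattern* > /data/DataStage/OPTICOM/FileNames_#FileDate#.txt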
Arun
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Or use the date command surrounded by backquotes as part of the file name.

Code:

ls -a /data/DataStage/OPTICOM/SRC_FilePattern* > /data/DataStage/OPTICOM/FileNames_`date +"%Y%m%d"`.txt
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
joefonseca79
Premium Member
Posts: 17
Joined: Mon Feb 18, 2008 7:56 pm
Location: Warwick, RI

I got it, I think

Post by joefonseca79 »

Here's the code that I did. It reads the contents of the folder and writes the listing to LOG.MMDDYYYY-HHMM.txt.

Then it takes all the files in the folder matching *.DAT and MOVES them to the BACKUP folder.

-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=

Code:

clear
echo "OPTICOM BACKUP START"
echo " "
echo " "
echo " " 
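# List everything currently in the source folder into a timestamped log file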
ls -all /data/DataStage/OPTICOM/SOURCE_FILES/*.* > /data/DataStage/OPTICOM/SOURCE_FILES/BACKUP/LOG.`date '+%m%d%Y-%H%M'`.txt
echo " "
echo " "
echo  "BATCH MOVE OF FILES TO BACKUP FOLDER"
echo " "
echo " "
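# Move all the *.DAT source files into the BACKUP folder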
mv /data/DataStage/OPTICOM/SOURCE_FILES/*.DAT /data/DataStage/OPTICOM/SOURCE_FILES/BACKUP
echo " "
echo " "
echo " "
echo " "
echo " "
echo " "
echo "OPTICOM BACKUP COMPLETE"
echo " "
echo " "
jOE fOnSEca
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Hopefully, nothing new has shown up between the time you processed them and the time you ran this.
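
One way to guard against that (just a sketch, assuming the file list is captured by the ls before the job runs, as suggested earlier in the thread) is to move only the files named in that list instead of everything matching *.DAT:

Code:

# Move only the files that were listed before processing started,
# leaving anything that arrived afterwards for the next run.
while read f
do
    mv "$f" /data/DataStage/OPTICOM/SOURCE_FILES/BACKUP
done < /data/DataStage/OPTICOM/FileNames.txt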
-craig

"You can never have too many knives" -- Logan Nine Fingers
joefonseca79
Premium Member
Posts: 17
Joined: Mon Feb 18, 2008 7:56 pm
Location: Warwick, RI

Post by joefonseca79 »

DataStage picks up all the files and processes them, then sends the output file via email. Then the UNIX script writes the log file and moves the source files to the backup folder.

Orders are sent to us every 15 minutes, so I think I'll be good.
jOE fOnSEca