
ftp

Posted: Thu Nov 23, 2006 11:48 am
by sainath
Hi,
I have files in a remote location, and new files are inserted daily. After that, I run an ETL job to process these files. I use ftp to get the files from the remote location to the DataStage location.

But I want to process only the NEW files each day.

Is there an FTP command that can do this?

Please share your thoughts.
Thanks

Posted: Thu Nov 23, 2006 12:01 pm
by narasimha
How do you identify the new files?
Do the files get overwritten every time?
Does the file name remain the same, or do you rename them with the datetime?

If you can have a fixed name for the file every time, or a script that renames the new file before you do the ftp, then you will be fine.

Posted: Thu Nov 23, 2006 12:30 pm
by sainath
Hi,
1. My idea is to store the processed file names in a hashed file, compare them with the input files, and delete the matched records.

2. How can I delete the matched files in the remote location using ftp?
Thanks

Posted: Thu Nov 23, 2006 12:52 pm
by chulett
Better to store them in a database table... then pull them into a hashed file to use for checking in the actual process run.
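The comparison being described can be sketched as a simple set difference: the new files are whatever appears in the remote listing but not in the processed list. This is a minimal illustration only; the file names are made up, and in the real job the processed list would come from the database table (and the hashed-file lookup) rather than a plain Python set.

```python
# Sketch: find NEW files by comparing the remote directory listing
# against the set of already-processed file names. The names below
# are hypothetical examples.

def new_files(remote_listing, processed):
    """Return files present remotely that have not been processed yet."""
    return sorted(set(remote_listing) - set(processed))

remote = ["sales_20061121.dat", "sales_20061122.dat", "sales_20061123.dat"]
done = ["sales_20061121.dat", "sales_20061122.dat"]

print(new_files(remote, done))  # only the unseen file remains
```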

Do you have the permissions you'd need to delete files from the ftp server? If so, you would just issue a del command inside the ftp session - your scripted ftp session.
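A scripted session along these lines could issue the delete. This is only a sketch of an ftp command-script file; the server name, user, password, directory, and file names are all placeholders, and whether the delete succeeds depends on your permissions on the ftp server.

```
open ftp.example.com
user myuser mypassword
cd /incoming
delete sales_20061121.dat
delete sales_20061122.dat
bye
```

On Windows you could save this as, say, cleanup.ftp and run it with ftp -n -s:cleanup.ftp (the -n switch suppresses auto-login so the user command in the script takes effect).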

The only way to do something like this using the dreaded FTP stage is via the before or after Telnet command options... and just because you can ftp to a server doesn't imply that you can also telnet to it.

Posted: Tue Nov 28, 2006 6:35 am
by sun786
1) ?
2) You have to write a remote shell script that executes on the remote UNIX machine. You also need to pass file-name parameters to this shell script. Try:

remsh servername -l userid 'test.sh'

Posted: Tue Nov 28, 2006 6:50 am
by DSguru2B
sun786... the OP happens to be sitting on a Windows box.