On 2013-02-05 17:29, chris.an...@gmail.com wrote:
I'm trying to delete all text files from an FTP directory. Is there a way to
delete multiple files of the same extension?

I came up with the code below, which works, but I have to trim the
string because ftp.nlst returns:

"-rwx------ 1 user group 0 Feb 04 15:57 New Text Document.txt"

but then when I try to delete it, that long name which includes the date doesn't exist - the
file's name is "New Text Document.txt", not "-rwx------ 1 user group 0 Feb 04 15:57
New Text Document.txt"

so anyway I stripped off the beginning, keeping the last 21 characters, and
it worked great - this should work given that I know all my text file names
are the same length in characters - but it seems like there should be a better,
more bulletproof way to do this?

[code]from ftplib import FTP

ftp = FTP('127.0.0.1')
ftp.login('')

directory = 'test'
ftp.cwd(directory)

files = ftp.nlst()

for file in files:
    if file.find(".txt") != -1:
        # keep only the last 21 characters - the file name itself
        file = file[-21:]
        ftp.delete(file)

ftp.close()[/code]

Any ideas on this? Thank you.

Firstly, instead of:

    file.find(".txt") != -1

use:

    file.endswith(".txt")

It's clearer (and it's true only if the ".txt" is at the end!)
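
For example, with a hypothetical name like "notes.txt.bak" the two tests
disagree, and only endswith() gives the answer you want:

    >>> "notes.txt.bak".find(".txt") != -1
    True
    >>> "notes.txt.bak".endswith(".txt")
    False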

Secondly, your code assumes that the filename is exactly 21 characters.
It looks like the strings returned by ftp.nlst() consist of 9 fields
separated by whitespace, with the last field being the filename,
which can itself contain spaces. That being so, you can split the strings
like this:

    fields = file.split(None, 8)

That'll make a maximum of 8 splits on any whitespace, giving 9 fields
and leaving any spaces in the filename intact in the last field, for example:

>>> "-rwx------ 1 user group 0 Feb 04 15:57 New Text Document.txt".split(None, 9) ['-rwx------', '1', 'user', 'group', '0', 'Feb', '04', '15:57', 'New', 'Text Document.txt']

Therefore:

    for entry in ftp.nlst():
        if entry.endswith(".txt"):
            filename = entry.split(None, 8)[-1]
            ftp.delete(filename)
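
Putting the two fixes together, a complete version of your script might
look like this - just a sketch, assuming the same host, empty login and
'test' directory as in your code, and that your server's nlst() always
returns these long listing lines:

    from ftplib import FTP

    ftp = FTP('127.0.0.1')
    ftp.login('')
    ftp.cwd('test')

    try:
        for entry in ftp.nlst():
            # the last of the 9 whitespace-separated fields is the name
            filename = entry.split(None, 8)[-1]
            if filename.endswith(".txt"):
                ftp.delete(filename)
    finally:
        ftp.close()

Testing endswith() on the extracted filename rather than on the whole
listing line comes to the same thing here (the line ends with the name),
but it says what you mean more directly.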
