On Tue, 27 Sep 2005 16:52:19 +0200 Haxe wrote:
> GTKG seems to always store downloaded files on disk using utf-8 to
> encode the filename. Instead, it should use the charset/encoding given
> by the locale. This _could_ be utf-8, but in my case ([EMAIL PROTECTED]) it
> isn't.
An ideal way to handle this would be a split treatment: keep the actual on-disk filename in the local encoding as it is, while converting the filename returned in query hits and on uploads to UTF-8 (a rough sketch of such a conversion is appended at the end of this mail). The underlying assumption on Gnutella, however, is that a downloaded file will be shared again, and a non-ASCII filename does not always fit the user's local encoding. That can be a problem: if you download a file whose name cannot be represented in ISO-8859-15, the conversion to the local encoding simply fails, and once a name has been converted lossily it is hard to convert back to UTF-8, even where the scheme above otherwise works.

To be more specific, I am not sure how often a German user would try to download a file with a Japanese name, so binding filenames to the local encoding might be the practical way; but I have seen Japanese filenames on 'us', 'ca', 'au', sometimes 'gb', even 'de' servents.

Of course I can completely understand your annoyance. Imagine what happens to UTF-8 Japanese filenames under EUC-JP: seeing more than bogus umlauts and eszetts is simply maddening.

Back to the first topic: what do you think the ideal handling would be? Perhaps a 'Convert filename encoding to' option and a 'Force disable partial file sharing' option would be required; probably not, no, really not.

For the workaround,

  convmv --notest -f UTF-8 -t ISO-8859-15 files

should be workable for you.

Regards,
--
Daichi
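P.S. The sketch mentioned above. This is only a minimal, hypothetical illustration of the query-hit side of the split treatment, not code from gtk-gnutella: it converts a locale-encoded on-disk name to UTF-8 with plain iconv(3). The function name locale_to_utf8 and the hard-coded ISO-8859-15 source charset are assumptions for the example; a real servent would take the charset from nl_langinfo(CODESET) and decide what to do with unmappable bytes.

/*
 * Sketch only: convert a locale-encoded filename to UTF-8 for a query hit.
 * The source charset is hard-coded here purely for illustration.
 */
#include <iconv.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static char *locale_to_utf8(const char *name, const char *from_charset)
{
    iconv_t cd = iconv_open("UTF-8", from_charset);
    if (cd == (iconv_t) -1)
        return NULL;                        /* charset unknown to iconv */

    size_t in_left = strlen(name);
    size_t out_size = in_left * 4 + 1;      /* UTF-8 needs at most 4 bytes/char */
    char *out = malloc(out_size);
    if (out == NULL) {
        iconv_close(cd);
        return NULL;
    }

    char *in_p = (char *) name;
    char *out_p = out;
    size_t out_left = out_size - 1;

    if (iconv(cd, &in_p, &in_left, &out_p, &out_left) == (size_t) -1) {
        free(out);                          /* unmappable byte: give up */
        out = NULL;
    } else {
        *out_p = '\0';
    }
    iconv_close(cd);
    return out;
}

int main(void)
{
    /* "Grüße" as stored on disk in ISO-8859-15: 0xFC = ü, 0xDF = ß */
    const char on_disk[] = { 'G', 'r', (char) 0xFC, (char) 0xDF, 'e', '\0' };
    char *utf8 = locale_to_utf8(on_disk, "ISO-8859-15");

    if (utf8 != NULL) {
        printf("query-hit name: %s\n", utf8);
        free(utf8);
    }
    return 0;
}

The reverse direction (UTF-8 name from the network back to the locale for the on-disk name) is exactly where the lossiness described above bites: iconv() fails as soon as the name contains a character the local charset cannot represent.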
