>> I'm using Wget on Windows Vista and it runs faster than other web
>> crawlers (like Httrack).
>>
>> Sometimes filename length exceeds the filesystem limit (in Vista this
>> limit is 260 characters, including the folder name).

> Are you sure it is INCLUDING the path?
> I have PATH + FILENAME much longer than 260 characters.
> It is Windows (Win7, Vista, XP) which writes it on my Samba server.

I'm not sure; in any case, for me the problem is the same: it fails when
trying to create files for URLs with many parameters.

>> Is there any option to use a hash function (SHA, MD5?) to encode
>> filenames in a shorter way? Is there any plan to implement it?

> Use the "-O <filename>" option.

The problem with this option is that I would first have to obtain the list
of links, compute a hash value for each URL, and rewrite every URL
reference in the HTML pages. I suppose this problem is very common and the
solution is a simple hash function that is aware of possible hash
collisions. In fact, httrack uses this method as its default URL encoding.
