On Fri, Jan 8, 2010 at 4:31 PM, Jamie Morken <jmor...@shaw.ca> wrote:
>
> Bittorrent is simply a more efficient method to distribute files, especially 
> if the much larger wikipedia image files were made available again.  The last 
> dump from english wikipedia including images is over 200GB but is 
> understandably not available for download. Even if there are only 10 people 
> per month who download these large files, bittorrent should be able to reduce 
> the bandwidth cost to wikipedia significantly.  Also I think that having 
> bittorrent set up for this would cost wikipedia a small amount, and may save 
> money in the long run, as well as encourage people to experiment with offline 
> encyclopedia usage etc.  Making people crawl wikipedia with Wikix if 
> they want to download the images is a bad solution, as it means that the 
> images are downloaded inefficiently.  Also one Wikix user reported that his 
> download connection was cut off by a wikipedia admin for "remote downloading".
>

The problem with BitTorrent is that it is unsuitable for rapidly
changing data sets, such as the image dumps. Adding even a single file
to a torrent changes the torrent's info hash, so you end up with a
separate peer pool for every version of the data set, even though the
versions mostly contain the same files.
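To make the point concrete, here is a minimal sketch of why that happens. Per the BitTorrent spec (BEP 3), a torrent's identity is the SHA-1 hash of its bencoded "info" dictionary, and peers only swarm with others announcing the same hash. The dictionary contents below are made up for illustration:

```python
import hashlib

def bencode(value):
    """Bencode ints, byte strings, lists, and dicts (BEP 3)."""
    if isinstance(value, int):
        return b"i%de" % value
    if isinstance(value, bytes):
        return b"%d:%s" % (len(value), value)
    if isinstance(value, list):
        return b"l" + b"".join(bencode(v) for v in value) + b"e"
    if isinstance(value, dict):
        items = sorted(value.items())  # keys must be sorted byte strings
        return b"d" + b"".join(bencode(k) + bencode(v) for k, v in items) + b"e"
    raise TypeError("cannot bencode %r" % type(value))

def info_hash(info):
    """A torrent's identity is the SHA-1 of its bencoded info dict."""
    return hashlib.sha1(bencode(info)).hexdigest()

# A toy multi-file info dictionary (field names per BEP 3, values invented).
info = {
    b"name": b"wikipedia-images",
    b"piece length": 262144,
    b"pieces": b"\x00" * 20,  # placeholder piece hashes
    b"files": [{b"length": 1000, b"path": [b"a.jpg"]}],
}
h1 = info_hash(info)

# Add one more file: every other field is unchanged...
info[b"files"].append({b"length": 2000, b"path": [b"b.jpg"]})
h2 = info_hash(info)

# ...yet the info hash, and hence the swarm, is different.
print(h1 != h2)  # True: old and new torrents form separate peer pools
```

So even though the two data sets share almost all their pieces, clients on the old torrent and clients on the new one never see each other.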

That said, it could of course be beneficial for an initial dump
download, and it is better than the current situation, where nothing
is available at all.


Bryan

_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l