>> Right now you can get a static dump + image dump and produce a fully
>> working copy of a Wikimedia wiki. If all the maps relied on map tiles
>> hosted somewhere else, that would break a lot of things for the
>> static dump, content-wise.
>
> I would have to learn more about how these static dumps are made.
> The Swedish Wikipedia (300,000 articles) contains no uploaded
> images.  It fetches all images from Wikimedia Commons, which
> contains 4 million files.  Does a static dump of sv.wikipedia
> require a complete copy of Wikimedia Commons? Or can you get a

"complete copy of Wikimedia Commons" = several TB of data. When
wikimedia commons had 2 million files, total size of all files in it
was 2 TB. Now it have 4 million files, so I assume it is about 4 TB or
images. Not all of them are used in articles.

> separate dump of only the necessary images?  Could the same

http://meta.wikimedia.org/wiki/Wikix can do that
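
For illustration, here is a rough Python sketch of the idea behind such
a tool - scan a pages-articles XML dump for image references and keep
only the file names that are actually used. The function name and the
regex are my own assumptions, not Wikix's actual code:

import bz2
import re

# Matches [[Image:Foo.jpg|...]] and [[File:Foo.jpg|...]] in wikitext.
IMAGE_RE = re.compile(r'\[\[(?:Image|File):([^|\]]+)', re.IGNORECASE)

def used_images(dump_path):
    """Yield each image file name referenced in the dump, once."""
    seen = set()
    with bz2.open(dump_path, 'rt', encoding='utf-8') as dump:
        for line in dump:
            for name in IMAGE_RE.findall(line):
                name = name.strip().replace(' ', '_')
                if name not in seen:
                    seen.add(name)
                    yield name

if __name__ == '__main__':
    for name in used_images('svwiki-latest-pages-articles.xml.bz2'):
        print(name)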

> selection work for map tiles?  We really don't want a tile
> generator to "upload" every tile through a MediaWiki.  :-)

There is a slight problem - unlike images in Wikipedia, map tiles are
frequently updated independently of wiki content. Get a planet dump
and you can generate all the tiles you need :)
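
As a rough illustration of "all the tiles you need", here is a small
Python sketch (my own, not an existing OSM tool) that enumerates the
slippy-map tiles covering a bounding box at a given zoom, using the
standard tile-numbering formulas from the OSM wiki. It also shows how
the tile count roughly quadruples with every extra zoom level, which
is the space argument quoted below:

import math

def deg2tile(lat_deg, lon_deg, zoom):
    """Convert WGS84 coordinates to slippy-map tile indices."""
    n = 2 ** zoom
    lat_rad = math.radians(lat_deg)
    x = int((lon_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.log(math.tan(lat_rad)
                            + 1.0 / math.cos(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

def tiles_for_bbox(min_lon, min_lat, max_lon, max_lat, zoom):
    """Yield (zoom, x, y) for every tile covering the bounding box."""
    x1, y2 = deg2tile(min_lat, min_lon, zoom)  # tile y grows southwards
    x2, y1 = deg2tile(max_lat, max_lon, zoom)
    for x in range(x1, x2 + 1):
        for y in range(y1, y2 + 1):
            yield zoom, x, y

if __name__ == '__main__':
    bbox = (14.173, 49.969, 14.63, 50.193)  # bbox from the export URL below
    for z in (8, 10, 12):
        count = sum(1 for _ in tiles_for_bbox(*bbox, z))
        print('z=%d: %d tiles' % (z, count))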

> Since these static dumps were produced in June 2008, perhaps this
> system doesn't need to be our top priority.
>
> It is also possible that, if the live maps for a city are at z=12,
> the static dump could show maps at z=8, thus saving lots of space
> by omitting the tiles for the deeper zoom levels.  After all, a
> static dump is already a compromise.

I guess there is no need for tiles - every map can be replaced by a
single image like this:
http://tile.openstreetmap.org/cgi-bin/export?bbox=14.173,49.969,14.63,50.193&scale=270000&format=png
for the static wiki versions.
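
If we wanted to script that, here is a rough sketch - only the export
URL format comes from the example above; the function name, scale
choice and output path are my own assumptions:

from urllib.parse import urlencode
from urllib.request import urlretrieve

EXPORT_URL = 'http://tile.openstreetmap.org/cgi-bin/export'

def fetch_static_map(bbox, scale, out_path):
    """Fetch one PNG covering bbox = (min_lon, min_lat, max_lon, max_lat)."""
    query = urlencode({
        'bbox': ','.join(str(v) for v in bbox),
        'scale': scale,
        'format': 'png',
    })
    urlretrieve('%s?%s' % (EXPORT_URL, query), out_path)

if __name__ == '__main__':
    # Same parameters as the example URL above.
    fetch_static_map((14.173, 49.969, 14.63, 50.193), 270000, 'prague.png')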

A few queries and we are done :)

Martin

_______________________________________________
talk mailing list
talk@openstreetmap.org
http://lists.openstreetmap.org/listinfo/talk
