On 04/02/16 11:27 PM, Robert Brockway wrote:
<snip>
One of the main issues is that opening such files required a popen (instead of
fopen), so the code was littered with checks on whether the file was
compressed.  It also had to record whether the file was compressed (so that
when it saved the file, it saved it compressed again), and it had to deal with
the potential of many different compression methods (compress, gzip, bzip2,
now xz, etc).  That support was removed to make the code cleaner, which is a
good thing, and at the time the size of maps (and other data) wasn't large
enough to be a concern.

Also, at the time, it allowed all files to be compressed (archetypes, player
files, etc).  Certainly allowing it only on map files would limit the number
of places that code would be needed.  Other assumptions could be made, like:
if compression is set in configure, then assume all map files to be opened
will be compressed, and all map files saved will be compressed (thus there is
no need to record at read time whether the file was compressed and what it
was compressed with).

What I was thinking about was quite a bit simpler.  Try to open the uncompressed
map file as normal.  If that fails try to open the same file in the same
directory with extension .xz compressed with the xz algorithm (or substitute
another similar compression algorithm to taste) while keeping all temp files
uncompressed.

That is more or less what the old code did - however, there were several possible compression methods in use, so it would have to try .xz, .gz, .Z, etc.
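
Roughly, every file open ended up looking something like this (just a sketch, not the actual old code - the struct, function, and field names are made up for illustration; the decompression commands are the usual gzip/bzip2/xz invocations):

#include <stdio.h>
#include <unistd.h>

/* Hypothetical record of how a map file was opened, so that saving
 * can re-compress it the same way. */
struct map_file {
    FILE *fp;
    int compressed;        /* nonzero if opened through popen() */
    const char *suffix;    /* which compressor produced the file */
};

static int open_map(const char *path, struct map_file *mf)
{
    static const struct { const char *suffix; const char *cmd; } methods[] = {
        { ".Z",   "uncompress -c" },
        { ".gz",  "gzip -dc" },
        { ".bz2", "bzip2 -dc" },
        { ".xz",  "xz -dc" },
    };
    char fname[1024], cmd[1100];
    size_t i;

    /* Plain, uncompressed file first. */
    mf->fp = fopen(path, "r");
    if (mf->fp) {
        mf->compressed = 0;
        return 0;
    }

    /* Otherwise try each known compressed variant in turn. */
    for (i = 0; i < sizeof(methods) / sizeof(methods[0]); i++) {
        snprintf(fname, sizeof(fname), "%s%s", path, methods[i].suffix);
        if (access(fname, R_OK) != 0)
            continue;
        snprintf(cmd, sizeof(cmd), "%s %s", methods[i].cmd, fname);
        /* popen() hands back a FILE * just like fopen(), but the bytes
         * come from an external decompression program. */
        mf->fp = popen(cmd, "r");
        if (mf->fp) {
            mf->compressed = 1;
            mf->suffix = methods[i].suffix;
            return 0;
        }
    }
    return -1;
}

And the caller then had to remember to pclose() rather than fclose() the stream - exactly the kind of special-casing that got removed.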

With the availability of compression libraries, if this was to be redone, that logic might be different - instead of calling popen, if the server found a .xz suffix, it would run it through the xz decoder. However, it still has to record that the file was in fact compressed when it read it in, so it knows to write it out compressed.
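
To sketch what the library approach could look like (using zlib's gzopen/gzgets for the .gz case - the .xz case would use liblzma in the same spirit; the map-parser hookup is just a comment, and none of this is actual server code):

#include <stdio.h>
#include <unistd.h>
#include <zlib.h>

/* Sketch only: prefer the plain map file, fall back to a .gz copy read
 * through zlib. */
static int load_map(const char *path, int *was_compressed)
{
    char line[4096];
    char gzpath[1024];
    FILE *fp;
    gzFile gz;

    fp = fopen(path, "r");
    if (fp) {
        *was_compressed = 0;
        while (fgets(line, sizeof(line), fp)) {
            /* ... hand each line to the map parser ... */
        }
        fclose(fp);
        return 0;
    }

    /* Plain file missing: try the compressed variant. */
    snprintf(gzpath, sizeof(gzpath), "%s.gz", path);
    gz = gzopen(gzpath, "rb");
    if (!gz)
        return -1;
    *was_compressed = 1;   /* remember, so the save path writes a .gz too */
    while (gzgets(gz, line, sizeof(line))) {
        /* ... hand each line to the map parser ... */
    }
    gzclose(gz);
    return 0;
}

The was_compressed flag is the piece that has to survive until save time, so the file gets written back out as .gz (or .xz).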

Doing it with a library is certainly better - I have to imagine the overhead of forking an external program to do the compression was not trivial.


The editor would need to know about that too of course.

That might have been another reason the old compression logic was removed - possibly the java editor did not support it. The old C editor used the same function to read maps as the server, so if compression was supported, crossedit supported it too. That certainly does not exist now.


Fair point, but you are sort of an edge case on this, which is to say, it's a
feature really only one person would probably use.

If 1000x1000 maps became standard in the game (or at least a supported add-on) it
could be common.

True, for certain cases (e.g., those where the cost of 100 GB of storage is significant because of hosting costs). Or, if someone is running everything on an SSD, the drive may not be particularly large, so chewing up 100 GB of it with map data may not be ideal.


I wonder if it is possible to do it with a plugin using Mapload or Mapenter:

http://wiki.cross-fire.org/dokuwiki/doku.php/server_plugin?s[]=events#hooking_to_global_events


If so the uncompressed map could be cleaned up by Mapunload or Mapreset.

Maybe - it is possible that this could be done outside the map-loading code itself, e.g., before the map is loaded, the plugin decompresses the map file so that the uncompressed file exists. However, I'm not exactly sure of the timing of those events. For MapLoad, the decompression would have to happen before any work on loading the map is done, and I suspect (though I could be wrong and am too lazy to look at the code now) that the event may be generated after the map is loaded but before anything is done with it (so that scripts can make certain changes to the map or do other special initialization). Likewise, MapUnload may be generated before the map is unloaded, so global events can do certain cleanup.

However, even if possible, I would expect performance to be worse in this case, as basically it would have to call an external program to decompress the map into a new file, and then the server reads in that file. That triples the I/O (read the compressed file, write the uncompressed copy, then read the copy back in) - now granted, it probably all remains in the file cache, but it is certainly less efficient than just decompressing the data as it is read in.
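
For what it's worth, the decompression step such a hook would have to perform is roughly the following (how it gets wired into the MapLoad event is glossed over; it just shells out to xz, with -k keeping the .xz around for the next load):

#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

/* Sketch: before the server touches "path", produce an uncompressed copy
 * from "path.xz" by running the external xz program. */
static int decompress_for_load(const char *path)
{
    char xzpath[1024], cmd[1100];

    if (access(path, R_OK) == 0)
        return 0;                 /* plain file already there */

    snprintf(xzpath, sizeof(xzpath), "%s.xz", path);
    if (access(xzpath, R_OK) != 0)
        return -1;                /* nothing to decompress either */

    /* -d decompress, -k keep the .xz around for the next reset/load. */
    snprintf(cmd, sizeof(cmd), "xz -dk '%s'", xzpath);
    return system(cmd) == 0 ? 0 : -1;
}

That is the read-compressed / write-uncompressed / read-uncompressed round trip mentioned above.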


Back to the mega map, one thing that was thought of back in the past was to
unify the scale.

When I was considering what to do with my new world a few months ago I went
through the archives and found a discussion on this topic.  I felt there were
some good arguments against having buildings at the same scale as the bigworld
map.  In particular there was a concern about the ability of characters to see
the buildings properly.  This seemed like a strong argument to me.

It would certainly make it harder to see how buildings relate to one another. However, in some ways, it would also make buildings easier to find - on such a huge world, it would be pretty easy to miss a castle in the now much larger forest, etc. On a single scale, you'd probably find the castle wall, though you might have to circle around to find the entrance.

However, such a change would be pretty massive, simply because many maps in towns generate monsters - one could pretty quickly find those monsters wandering the town, and perhaps also multiplying to a large degree.

Another, bigger issue would likely be map resets - even if no one enters the church in town, players may wander close enough that it doesn't reset as often as it should.


