The problem with drawing a bright line is that somebody is inevitably
left on the other side. Fortunately, a standards working group has already
drawn such a line for us: C99. We do not have to adhere to it rigidly, but
instead of requiring specific versions of a specific toolchain, we should
write portable code that stays reasonably within a particular standard.

  And to follow that, if there are features of a specific version of the
language that would be useful, say the requirement is 'the compiler you use
must support foo. foo is known to be supported in gcc x, visual studio y, ...
If your compiler is not listed, check whether it supports foo, or whether
additional compiler flags are needed.'

Exactly. My point is that the code MUST be compilable by a set of specific
toolchains by following a specific standard (e.g., C99). Some C99 features
will probably not work on all specified toolchains; those must therefore be
omitted from the code. I see no need to check for a specific compiler.
Instead, I'd just assume it.
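
As a minimal sketch of 'assume the standard, not the compiler', a central
header could simply refuse pre-C99 compilers up front (the error text is
illustrative):

    /* Fail fast on pre-C99 compilers instead of sniffing for gcc/MSVC. */
    #if !defined(__STDC_VERSION__) || __STDC_VERSION__ < 199901L
    #  error "A C99-conforming compiler is required."
    #endif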


  That in many ways works better - oftentimes a compiler will lack full
support for a standard, but support the features we care about.

I agree.

2. With the platform requirements given above, C99 and/or C++11 can
be assumed. Even if we decide not to use any C++ at all, I would
suggest compiling the code with a C++ compiler for reasons of
portability.

I've seen recommendations to compile C using a C++ compiler. However, if
you refer to Bjarne Stroustrup's authoritative book, he admits that
certain incompatibilities exist. C++ is no more standard than C, and C
is just as portable as C++ (maybe even more so).

No. I'm thinking of the GCC project -- they are doing it this way. The
incompatibilities are really restricted to struct initialization and
using C++ keywords as identifiers (like new or class).
BTW: that book is neither a good reference nor a good C++ tutorial, IMHO.
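
To make those incompatibilities concrete, here is a sketch (identifiers made
up) that is valid C99 but is rejected when compiled as C++:

    #include <stdlib.h>

    struct item { int weight; int value; };

    void demo(void) {
        /* C99 designated initializers are not part of C++98/C++03. */
        struct item sword = { .weight = 10, .value = 5 };

        /* 'class' is an ordinary identifier in C but a keyword in C++. */
        int class = 2;

        /* C converts void * implicitly; C++ requires an explicit cast. */
        int *buf = malloc(4 * sizeof *buf);

        free(buf);
        (void)sword; (void)class;
    }

Running something like g++ -fsyntax-only over the tree would surface exactly
these spots.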

  The other problem I think that can lead to is this - suppose some change is
made that works fine when compiled in C mode but fails in C++ mode for whatever
reason - you now get the problem of whether the developer making the change will
actually care about that, and depending on where that incompatibility is,
whether they can actually figure it out if they are a pure C programmer.

I had already compiled the crossfire 1.70 code with C++ using GCC last year.
I can't remember any serious issues.

  If anything, for full compatibility, compiling with different compilers with
full warnings/strict mode may be better.

I tend to agree, though I have had some bad experiences in another project
with a lot of false positives using VS 2008. With GCC, I would always
recommend compiling with -W -Wall.
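
For the record, a hedged example of what strict-mode invocations could look
like on the two toolchains mentioned (the flag choice is a suggestion, not a
project decision):

    # GCC: treat the code as strict C99 with extended warnings.
    gcc -std=c99 -W -Wall -Wextra -pedantic -c server.c

    # Visual Studio (cl.exe): highest warning level, extensions disabled.
    cl /W4 /Za /c server.c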

3. With defined platform and compilers, cleanup and janitorial work
can start. This includes, e.g., the use of standard types (like bool
or uint32_t), standard functions (like calloc), removal of various
autoconf checks, etc.

I'm in favor of doing this in the mid-term. We already have a nice
collection of compatibility macros that can serve as a crutch for
compilers that do not obey C99.
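
One such crutch might look like this sketch (HAVE_STDBOOL_H standing in for
whatever symbol the build system actually defines):

    /* Use the real C99 header where available, an approximation otherwise. */
    #ifdef HAVE_STDBOOL_H
    #  include <stdbool.h>
    #else
    typedef enum { false = 0, true = 1 } bool;
    #endif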

  And that can certainly be extended.  The addition of functions like snprintf
are worth supporting (as are strlcat and strlcpy if those are part of some
standard), but those can also be easily checked for in autoconf, and if they
fail to exist, some simple conditionals can check for that and private
functions added.  Same for fixed-size types - the native types used by the
compiler can be used instead of the typedefs currently in place, but if those
native types are not available (due to an old version), a simple enough ifdef
can fall back to the typedef instead.

TBH, I have some problems with strlcat and strlcpy: they are supported
neither by ISO C nor by POSIX. The glibc project has repeatedly refused to
add those functions to the standard library. They offer no more functionality
than strncpy and strncat, and they are a known source of potential bugs
(a quick Google search for strlcpy and glibc will reveal the details).
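
For snprintf and the fixed-size types, the check-and-fallback pattern
described above might look like this (HAVE_* symbols as produced by
autoconf's AC_CHECK_FUNCS/AC_CHECK_HEADERS; compat.c is a hypothetical home
for the private replacement):

    # configure.ac
    AC_CHECK_HEADERS([stdint.h])
    AC_CHECK_FUNCS([snprintf])

    /* compat.h */
    #include <stddef.h>

    #ifndef HAVE_SNPRINTF
    /* Private replacement, implemented once in compat.c. */
    int snprintf(char *buf, size_t size, const char *fmt, ...);
    #endif

    #ifdef HAVE_STDINT_H
    #  include <stdint.h>
    #else
    typedef unsigned int uint32_t;  /* assumes 32-bit int; illustrative only */
    #endif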

I guess we're in disagreement here. If I understood you correctly, you want to
support old compilers, while my approach is to remove old cruft to make
crossfire maintainable. What old toolchains are being used, and which of them
do you want to support and test?

4. Modernize architecture and replace existing components.

I'm not exactly sure what this means. I also see no point in replacing
components that have been in service and aren't breaking. I see no harm
in rewriting code, but it'd be a lot more productive to focus on making
the game more fun than fixing what isn't broken.

  I'd note that a lot of the goofy, ugly, or odd code exists because maps expect
it that way.  Which is to say, some functions could potentially be made cleaner
and simpler, but to do so would require examining every map and making changes
to some number of them (and depending on exactly what construct is being used,
detecting those automatically might be hard).

I had in mind the networking, event processing/server loop, and plugin code.

  I'm all for fixing some of that, but it falls into the category of a lot of
work with no direct/end user effect.  For programmers, there is cleaner code,
but for players, things worked (or should work) exactly the same as before.  So
those types of changes tend to be somewhat low priority just for that reason.

I agree, though my contribution will certainly be in the coding area. I leave
the gameplay and game content creation to the experts :)
