That is a lot of criteria you list here!
I agree with most of them, but I dislike a few, namely "speed" and
"optimizations".
I believe that just knowing whether an implementation is interpreted or
compiled (to C, machine code, whatever) gives enough information for a
rough notion of speed when judging its usability.
I also miss (unless I misread it) the age of the implementation. Ikarus
is newer than Gambit, and that deserves to act as a weight in any such
computation (one that decreases as time goes by, to reflect how
quickly new features get implemented).
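To make that concrete, such a recency weight could be a simple exponential decay of the implementation's age; this is only a sketch, and the five-year half-life below is an arbitrary assumption, not a proposal.

```python
import math

def recency_weight(age_years, half_life=5.0):
    """Weight that halves every `half_life` years (arbitrary choice)."""
    return 0.5 ** (age_years / half_life)

# A newer implementation (e.g. Ikarus) gets more weight than an older
# one (e.g. Gambit) under this scheme.
print(recency_weight(2))   # ~0.76
print(recency_weight(20))  # ~0.06
```

Any monotonically decreasing function would do; the exponential just makes the "fades as time goes by" behavior explicit.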

Now, do you want to put all the scores together in a dish, then into
the oven, and see the result later [1]?
Or did you think of a multidimensional ranking comparing only A's with
A's, B's with B's, and so on?
We must remember that some implementations are specialized for one task
and do it extremely well: you would not want to penalize them because
they don't offer OpenGL databases over a relational pigeon debugger,
would you?

I think that you sketched some nice comparison criteria [2], and
separating them into groups was a very good idea.
Thus, based on your suggestions, I think that we could assign an
objective score to each group.
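As a sketch of what that could look like (the group letters come from Vincent's list, but the scores and weights below are invented purely for illustration):

```python
# Hypothetical per-group scores on a 0-10 scale for one implementation,
# keyed by Vincent's group letters (A = academic, B = scope, ...).
scores = {"A": 7, "B": 9, "C": 4, "D": 6, "E": 8, "F": 5, "G": 6}

# Subjective weights -- this is exactly where the bias creeps in.
weights = {"A": 0.5, "B": 2.0, "C": 1.0, "D": 1.5, "E": 2.0, "F": 1.0, "G": 1.0}

# Weighted average: one "figure of merit", for whatever that is worth.
total = sum(scores[g] * weights[g] for g in scores) / sum(weights.values())
print(round(total, 2))  # → 6.83
```

Changing the weights changes the winner, which is rather the point: the per-group scores can be objective even when the combined number is not.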

*Then*, we may proceed to the ranking troll by using these numbers to
build (or not) some subjectively biased value.
Also, the ranking does not have to be absolute. A preference system
allowing A>B, B>C, C>A may be good too.
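A quick way to see how such a non-absolute ranking can arise: with pairwise majority voting over per-group comparisons, A>B, B>C, and C>A can all hold at once. The three-ballot profile below is invented precisely to force that classic Condorcet cycle:

```python
# Three hypothetical voters, each ranking implementations A, B, C
# (earlier in the list = preferred). Classic cycle-inducing profile.
ballots = [["A", "B", "C"], ["B", "C", "A"], ["C", "A", "B"]]

def prefers(x, y):
    """True if a strict majority of ballots rank x above y."""
    wins = sum(b.index(x) < b.index(y) for b in ballots)
    return wins > len(ballots) / 2

print(prefers("A", "B"), prefers("B", "C"), prefers("C", "A"))
# → True True True: every pairwise comparison has a 2-to-1 majority,
# yet there is no overall winner.
```

So pairwise preferences are perfectly usable for comparison without ever having to commit to a single total order.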


Cheers,

P!

[1] By the way, the cake is a lie.
[2] Comparing Scheme implementations should *not* be seen as
blasphemy; do we all agree on that at least?

2009/9/2 Vincent Manis <[email protected]>:
> I was thinking about what metrics one *might* use on Scheme
> implementations. This was happening during my fitness class this
> morning, so we're not exactly talking profound thinking. Nonetheless,
> I came up with the following list.
>
> A. Academic contribution
>    - number of scholarly publications in journals and conferences
>    - number of technical reports and other unrefereed publications
>    - other systems (Scheme or not) that use ideas introduced in this
>      implementation
>
> B. Scope
>    - conformance to standards (RnRS)
>    - number of useful[1] SRFIs implemented
>    - support for internationalization
>    - additional useful features
>    - support for an object system
>
> C. Development facilities
>    - presence of an IDE[2]
>    - editor integration
>    - quality of error reports
>    - presence and quality of debugger and profiler
>
> D. Platform support
>    - FFI powerful enough to support all of (say) OpenGL and a
>      relational DBMS
>    - access to all OS, GUI, etc. facilities without having to write
>      custom code
>    - threads, either provided by the implementation or native
>    - ability to generate standalone executables
>
> E. Implementation quality
>    - speed on a standardized benchmark[3]
>    - number of optimizations performed during compilation, and the
>      effectiveness of each
>    - memory usage on the same benchmark
>    - ability to pass one or more test suites regarding the items
>      in B above
>    - number of platforms supported
>    - ability to run in an embedded environment, especially one with
>      limited resources
>    - ability to run for extended periods of time without
>      degradation, e.g., as a server
>    - for implementations whose core is written in C: ability to pass
>      a static checker such as PC-Lint, and a runtime analysis tool
>      such as Purify; for implementations written in Java or C#,
>      ability to pass similar tools; implementations written in
>      Scheme get a pass on this one :)
>    - number of external dependencies, ease of building from source
>    - overall design and coding practices; ease of adding new
>      primitives (not necessarily the same as FFI)
>
> F. Usability
>    - documentation quality, including organization, completeness,
>      examples, etc.
>    - ease of installation (not quite the same as `building from
>      source' above)
>    - release policy; attention paid to backwards compatibility in
>      new releases
>
> G. Social issues
>    - availability of application libraries
>    - size and general activity of user community
>
> I have no doubt omitted your favorite criterion, and for that you
> are of course free to flame me. However, my point is really the
> obvious one, namely that no single figure of merit makes any sense.
> One person might reject an implementation because it runs too
> slowly, while another might love it because of a very powerful
> debugger (I'm not saying those two features are at all opposed,
> merely that it is possible to imagine implementations that have one
> but not both features). Similarly, an implementation might have been
> very influential in the past, but now moribund. Another
> implementation might add absolutely nothing new, but still be
> extremely well-engineered, and hence efficient and great to use.
>
> Notes:
>   [1] No TCP/IP stack can be complete without an implementation of
>       RFC 1149, the carrier pigeon protocol.
>   [2] I'd probably regard the availability of an IDE as a negative.
>   [3] Of course we'd have to AGREE on a standard benchmark first!
>
> -- v
>
> _______________________________________________
> r6rs-discuss mailing list
> [email protected]
> http://lists.r6rs.org/cgi-bin/mailman/listinfo/r6rs-discuss
>



-- 
Français, English, 日本語, 한국어

