On Thu, Nov 20, 2008 at 11:53, Andy Michaels <[EMAIL PROTECTED]> wrote:
> Hi everyone.  Over the years, there has been much talk about network
> performance of various Soekris models.  I think it might be
> interesting to post some standardized results based on agreed-upon
> metrics and test cases.

Metrics & test cases will probably be easy to agree on; what
constitutes appropriate tuning on the candidate OS, and whether to
allow those tunings at all, will likely be harder.  I personally would
like to see either multiple untuned OSes or a single carefully-tuned
one generally accepted to have the fastest stack.  If this is to go on
for a while you could add the tunings in a later round, but as tends
to happen with such efforts, the first round is usually the only one
that gets done.
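
To make the tuning question concrete, each entry would publish its
knobs up front, roughly like the sketch below.  The sysctl names are
only examples of the kind of settings involved, not a recommended or
complete set, and the OS labels are placeholders.

# Purely illustrative: the kind of per-candidate tuning declaration
# each entry would have to publish so results stay comparable.
# Sysctl names are examples only, not a recommended or complete set.
TUNINGS = {
    "openbsd-untuned": {},                      # stock install, no changes
    "freebsd-tuned": {
        "net.inet.ip.fastforwarding": 1,        # example sysctl knob
        "kern.ipc.maxsockbuf": 2097152,         # example sysctl knob
    },
    "linux-tuned": {
        "net.core.netdev_max_backlog": 2500,    # example sysctl knob
    },
}

Disallowing tunings would then just mean every entry ships an empty
dictionary.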

> On the software side, we'd need to agree on a set of OSes and configs
> as well as define a test suite.  I can stub out some sample test
> cases on the wiki, but I'll need the community to complete the list.

Stub away.  Mousetrap and all.
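
For what a first stub might look like, here's a rough, machine-readable
sketch of a test-case list; the tool choices, packet sizes and
durations are placeholders for whatever the wiki ends up agreeing on.

# Rough sketch of a test-case list a wiki stub could start from.
# Tool choices, traffic patterns and durations are placeholders,
# not an agreed methodology.
SAMPLE_TEST_CASES = [
    {
        "name": "tcp-throughput-forwarding",
        "tool": "iperf",                  # placeholder tool choice
        "traffic": "single TCP stream routed through the board",
        "metric": "Mbit/s",
        "duration_s": 60,
    },
    {
        "name": "small-packet-forwarding",
        "tool": "iperf",                  # placeholder tool choice
        "traffic": "64-byte UDP datagrams routed through the board",
        "metric": "packets/s forwarded vs. offered",
        "duration_s": 60,
    },
    {
        "name": "new-connections-per-second",
        "tool": "httperf",                # placeholder tool choice
        "traffic": "short HTTP requests through a NAT/pf ruleset",
        "metric": "connections/s",
        "duration_s": 60,
    },
]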

> Also, please let me know if you think this is a worthy use of time and
> resources, or if you think it's a complete waste of time.

Benchmarking is generally less worthwhile than it looks, since
synthetic tests seldom reflect real-world traffic.  Unless you have a
huge amount of time on your hands, I would suggest aiming at the edge
cases (worst & best scenarios) that outline the performance envelope
instead of trying to color it in.
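
As a concrete illustration of the two ends of that envelope, something
like the sketch below would do: a flood of minimum-size UDP datagrams
for the worst case and a single bulk TCP stream for the best case.  It
assumes iperf is already running in server mode on the far side of the
board; the address, offered rate and durations are placeholders.

#!/usr/bin/env python
# Sketch of the two envelope endpoints: per-packet cost dominates the
# worst case, bulk throughput shows the best case.  Placeholder values.
import subprocess

FAR_SIDE = "192.168.2.10"   # placeholder: host behind the board under test
DURATION = "30"             # seconds per run, placeholder

def worst_case():
    """Minimum-size UDP flood: stresses per-packet forwarding overhead."""
    subprocess.call(["iperf", "-c", FAR_SIDE, "-u",
                     "-b", "100M",    # offered load, placeholder
                     "-l", "64",      # 64-byte datagrams
                     "-t", DURATION])

def best_case():
    """Single bulk TCP stream: shows the large-packet ceiling."""
    subprocess.call(["iperf", "-c", FAR_SIDE, "-t", DURATION, "-f", "m"])

if __name__ == "__main__":
    worst_case()
    best_case()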