We have a powerful testing framework now, and I would like to encourage
everybody here to use it.

To keep track of results and also to give credit for the effort required for
such tests, I have implemented a table of testing results in README.release
(revision 10814). So far there is only one column of results in that table,
corresponding to the default Linux configuration for CMake-2.8.1-rc3, but I
soon hope to fill in 5 additional columns to show how both CMake-2.6.4 and
CMake-2.8.1-rc3 do for each of our major configurations (the default
configuration, which is shared libraries/dynamic devices; shared
libraries/static devices; and static libraries/static devices).
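For reference, those three configurations are normally selected with a pair
of CMake cache variables.  This is just a sketch assuming the usual
BUILD_SHARED_LIBS and ENABLE_DYNDRIVERS option names; check the top-level
CMakeLists.txt if in doubt:

```shell
# Default: shared libraries with dynamically loaded device drivers.
cmake -DBUILD_SHARED_LIBS=ON -DENABLE_DYNDRIVERS=ON ../plplot

# Shared libraries, but device drivers compiled into the library.
cmake -DBUILD_SHARED_LIBS=ON -DENABLE_DYNDRIVERS=OFF ../plplot

# Static libraries with static device drivers.
cmake -DBUILD_SHARED_LIBS=OFF -DENABLE_DYNDRIVERS=OFF ../plplot
```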

The idea here is that as the release approaches and we do further tests, we
will only have to update the revision number in a column to indicate a later
test (and also the CMake-2.8.x version once 2.8.1 is released and used for a
test).  Also, as time goes on and we deal with the issues found in these
tests, it should be possible for individual testers to trim the notes about
obvious errors that are found (e.g., the current note 4 about the -dev tk
issue).  Finally, README.release is obviously under svn control, so we can
easily access old sets of test results to discover useful information such
as the last revision where the -dev tk driver worked for the install tree.
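Since the file is under svn control, digging up an old set of results is
just a matter of standard svn commands, e.g.:

```shell
# Show the revision history of the test-results file.
svn log README.release

# Print the file exactly as it was at a given revision.
svn cat -r 10814 README.release

# Compare an old revision against the current one to see which
# results changed.
svn diff -r 10814:HEAD README.release
```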

I have put in a category in the test table to describe the version of the
pango/cairo stack of libraries, and similarly for Qt.  Leave those blank if
you are not testing the cairo and/or qt devices.  Of course, I strongly
encourage you to run such tests, since those devices are available on all
platforms and give our best-looking results.

Arjen, I suggest you also fill in 6 columns in this table when your current
tests of the three Windows platforms are done for CMake-2.6.x and
CMake-2.8.1-rc3.  Werner, I suggest you also fill in a column describing
your recent Mac OS X test.  I am also hoping for test results from the rest
of our active core developers, and from anyone else here who would like to
help out with PLplot testing.  For that latter group, it would be ideal to
send me a patch to README.release that gives the results of your tests; but
if you just send me a column of test data in e-mail corresponding to all the
categories in the testing results table in README.release, I can transcribe
the data so you will get credit for your testing work.

Alan
__________________________
Alan W. Irwin

Astronomical research affiliation with Department of Physics and Astronomy,
University of Victoria (astrowww.phys.uvic.ca).

Programming affiliations with the FreeEOS equation-of-state implementation
for stellar interiors (freeeos.sf.net); PLplot scientific plotting software
package (plplot.org); the libLASi project (unifont.org/lasi); the Loads of
Linux Links project (loll.sf.net); and the Linux Brochure Project
(lbproject.sf.net).
__________________________

Linux-powered Science
__________________________

_______________________________________________
Plplot-devel mailing list
Plplot-devel@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/plplot-devel
