Dear Stefan and Andrea,
The evaluation method was included in our training - both within the CASCADOSS 
regional workshops and the summer courses organized by SADL for participants 
from developing countries. It is aimed at ordinary - let's say 
"middle-level" - users, to help them find an Open Source alternative to the 
proprietary tools they use. Our experience showed that, in the case of the 
technical potential, that "middle level" should be a bit higher, as finding the 
detailed technical features of the software, whether in the documentation or 
hands-on, is far more difficult.

As for the evaluation we performed in CASCADOSS: from my experience with FOSS4G 
documentation and evaluation, some of the projects were really poorly 
documented, and looking for information sometimes required quite extensive 
research. What we could not find, we tried to figure out through hands-on 
experience or, in the case of very incomplete documentation, through direct 
communication with project leaders or communities.

In CASCADOSS we put a huge effort into documenting and evaluating the software 
with respect to applications in the environmental domain, aiming to teach 
environmental NGOs, administrations and companies about the strengths and 
weaknesses of the tools and to provide them with a method for finding the tool 
they need. None of the software tools was preferred - we had a general rule to 
remain objective. There was not much room for subjective interpretation anyway: 
the evaluation was based upon the documentation of the software project and 
hands-on experience, as well as secondary opinions and benchmarks. We put equal 
scrutiny on each of the software projects we evaluated, to assure true 
comparability between them. IMO, contacting the communities to ask for a review 
of the results would have contradicted the goal of the evaluation, which was 
designed to be performed by ordinary GIS users based on available 
(documentation) or discoverable (hands-on) information. On the other hand, 
sending the evaluation results to the projects' leaders or communities would 
certainly have raised awareness of the method and results in the FOSS4G 
community... 

I exercised the evaluation with the students of our Summer School on FOSS4G in 
the 2008 and 2009 editions, and the results for Quantum GIS improved over the 
two years but remained consistent with the previous evaluation from late 2007.  
We tested only the technical potential, but I think FOSS4G documentation has 
improved a lot during the last two years, so the remaining two potentials will 
also get higher scores for some of the projects. New standards addressing 
spatial web services are appearing on the market, hence the list of web mapping 
functions and desktop clients' features should certainly be extended as well.

The project was financed by the 6th Framework Programme of the EU, and only 
thanks to this financing were we able to spend so much time and so many 
resources and evaluate such a broad range of software projects. The project 
ended in April 2009, and I guess much criticism may be raised about financed 
projects here - what happens with the deliverables after a project's end? Maybe 
we did not put enough effort into disseminating the results: one paper at the 
4th gvSIG Conference, a few posters at conferences, and one paper to be 
published in vol. 6 of the OSGeo Journal - a short report from the CASCADOSS 
International Workshop and Symposium held in June 2008 under the patronage of 
the OSGeo Foundation. I bet other CASCADOSS partners can add much to the 
dissemination history.
I guess we rather focused on reaching end users in our regions to raise 
awareness of FOSS4G. CASCADOSS is currently continued by one of the CASCADOSS 
partners - Adam Podolcsak and his company Compet-Terra, who has been very 
active in promoting Open Source in Hungary. He initiated the CASCADOSS 
Association and continues to produce the CASCADOSS LiveDVD, which we use in our 
trainings on FOSS4G, and there is a possibility it will be used in some 
testbeds in the future. We are looking for an opportunity for a follow-up 
project - I think there is no better place than OSGeo to create a consortium 
for that (-: but we do not see the opportunity within the calls that appear at 
the moment.

You can find more information on the CASCADOSS international website: 
www.cascadoss.eu, on the website of the CASCADOSS Association at Compet-Terra: 
http://cascadoss.competterra.com/cascadoss.php?home_en, and in an upcoming 
paper containing a short report on the CASCADOSS international events held in 
June 2008 in Warsaw: discussions on license issues, business models, GMES, as 
well as a SWOT analysis guided by Markus Neteler and Mateusz Loskot - Guys, 
again many thanks for your extremely valuable contribution in Warsaw!

I must say that I got several mails asking why a particular software project 
got a lower score. No one from the high ranks complained (-; In the case of 
GeoServer I guess there is an eternal race to be the best on the market. (-;
I encourage you to make an evaluation using the CASCADOSS form, so you can 
compare the current versions of the two leading web map servers. I think making 
the evaluation yourself, not only viewing the scores, could be profitable not 
only for us - to know your opinion and improve the method - but for you as 
well, to maybe find missing elements in your software project or gaps in its 
documentation. (-:

Best regards:
Raf

Dr. Rafal Wawer
K.U.Leuven
R&D Division SADL (Spatial Application Division)
Celestijnenlaan 200e bus 2224
BE-3001 Leuven-Heverlee
Belgium
tel. 0032 16 329731


-----Original Message-----
From: discuss-boun...@lists.osgeo.org [mailto:discuss-boun...@lists.osgeo.org] 
On Behalf Of Mateusz Loskot
Sent: 18 February 2010 21:31
To: OSGeo Discussions
Subject: Re: [OSGeo-Discuss] Comparison

Stefan Steiniger wrote:
> same for me - I stumbled over this page after a hint from Markus - and 
> nobody ever asked us to have a look over it or send an email.

Stefan,

Please try to understand what Rafal says.
It was intentional, and it was part of the idea of the evaluation made in the 
frame of the CASCADOSS project.

There is no point in having an outraged debate about whether this idea was 
right or not, as it would be a never-ending story.
It would be better to consider it as a software benchmark.
It is not possible to create a benchmark that is completely and fully objective 
and brings happiness and peace to every corner of the geeky mindset.
Every study makes particular assumptions, and so does the CASCADOSS study.

The point is to review current status and make it better.

Best regards,
--
Mateusz Loskot
http://mateusz.loskot.net

_______________________________________________
Discuss mailing list
Discuss@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/discuss
