Hi,

I discovered and tested under Sugar the application "Openallure for
Learning" ( http://code.google.com/p/open-allure-ds/ and
http://openallureds.ning.com/ ),
a "voice and vision enabled educational software" written in Python.

As a Python application, it is multiplatform (Linux, Mac, Win). I got it
running on Ubuntu (after adding a few (22) dependencies) and on an XO 1.5
running Sugar 0.90.1 (after adding python-config-obj and python-nltk).
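
In case someone wants to reproduce this, a quick way to see whether the
Python-level dependencies are already installed is a small import check
like the sketch below (the module names configobj and nltk are my
assumption of what the python-config-obj and python-nltk packages
provide; extend the list with whatever else Openallure imports):

    # check_deps.py - rough sketch: report which Python modules are missing.
    # The module names are assumptions based on the package names above.
    required = ['configobj', 'nltk']

    missing = []
    for name in required:
        try:
            __import__(name)
        except ImportError:
            missing.append(name)

    if missing:
        print('Missing modules: ' + ', '.join(missing))
    else:
        print('All listed modules were found.')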

Thanks to its scriptable design, it lets a teacher, for example, create
an interactive dialogue consisting of a set of questions (displayed on
screen and spoken through text-to-speech synthesis) and of answers that
the student gives by pointing a finger at the line corresponding to the
selection, as captured by the camera.
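
To give an idea of what such a script boils down to, here is a very
small Python sketch of one question/answer step. It is only my
illustration of the concept, not Openallure's actual code or script
format; it assumes the espeak command-line synthesizer is installed (it
usually is on the XO) and reads the selection from the keyboard instead
of the camera:

    # dialog_sketch.py - illustration of the concept only, not Openallure code.
    # (Python 2, as shipped with Sugar 0.90)
    import subprocess

    def speak(text):
        # Use the espeak command-line synthesizer if available,
        # otherwise fall back to just showing the text on screen.
        try:
            subprocess.call(['espeak', text])
        except OSError:
            pass

    question = 'Which planet is closest to the Sun?'
    answers = ['Venus', 'Mercury', 'Earth']

    print(question)
    speak(question)
    for i, answer in enumerate(answers):
        print(str(i + 1) + '. ' + answer)

    # Openallure takes the selection from the camera (pointing at a line);
    # here we simply read a number from the keyboard instead.
    choice = int(raw_input('Your choice: ')) - 1
    print('You selected: ' + answers[choice])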

It is certainly worth experimenting with the interactivity offered by
the Openallure application, and I am sure there is room for designing
powerful extensions.

It worked here under Sugar (0.90) from the command line. I also tried to
sugarize it as an activity (following the instructions at
http://wiki.sugarlabs.org/go/Running_Linux_Applications_Under_Sugar );
however, I did not manage to get a better user experience than with the
native application.
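
For anyone who wants to try the same thing: as far as I understand it,
the wrapping boils down to an activity bundle whose activity/activity.info
points to a small launcher class, roughly like the sketch below. The
class and file names are just placeholders, and this is only my rough
sketch of the approach, not a tested wrapper:

    # activity.py - minimal launcher sketch (Sugar 0.90-era Python API).
    # Assumes the Openallure sources are copied into the bundle as
    # openallure.py (a placeholder name for this example).
    import subprocess

    from sugar.activity import activity

    class OpenAllureActivity(activity.Activity):
        def __init__(self, handle):
            activity.Activity.__init__(self, handle)
            # Start the upstream application as a separate process; it
            # opens its own window rather than drawing into the activity
            # canvas.
            subprocess.Popen(['python', 'openallure.py'],
                             cwd=activity.get_bundle_path())

The exec line in activity.info would then be something like
"exec = sugar-activity activity.OpenAllureActivity".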

Kind regards,

Samy




