Re: [Sugar-devel] List of activities on ASLO

2016-03-14 Thread Chris Leonard
I think that is only present in the activity.info file.

One of my next steps is to download every package and look at the
activity.info file for the bundle id, as well as look for signs of
i18n (e.g. a PO file).  That might take a while.

cjl

On Mon, Mar 14, 2016 at 11:20 AM, Sebastian Silva
 wrote:
> Guys, one thing that is missing here is the bundle_id (org.laptop.Terminal).
>
> This is supposed to be the unique identifier of each activity. Otherwise
> it will be very difficult and error-prone to match, prune duplicates, etc.
>
> Regards,
> Sebastian
>
>
> On 14/03/16 at 10:18, Chris Leonard wrote:
>> This time with attachment.
>>
>> On Mon, Mar 14, 2016 at 11:18 AM, Chris Leonard
>>  wrote:
>>> Thanks for the scripts, Tony.  I filled in other fields with grep of
>>> the aslo# files.  I think scraper.py does some sorting that causes the
>>> collection list and the grep scrapes not to align properly (in part
>>> because of case-sensitive sorting).  It was laborious; it would be nice
>>> to have an improved version of such a data-collection tool for periodic
>>> monitoring of activity status (PO filename, or latest version number
>>> and update date).
>>>
>>> I thought others might be interested in the results (so far).
>>>
>>> What I would love to add as columns on this spreadsheet are:
>>>
>>> ported to gtk3?
>>>
>>> set up for i18n?
>>>
>>> repo location?
>>>
>>> POT in Pootle.
>>>
>>> In going through this sheet, there are some apparent duplications of
>>> activities (possibly for single language support).
>>>
>>> cjl
>>>
>>>
>>> On Sat, Mar 12, 2016 at 12:57 AM, Tony Anderson  wrote:
 Hi, Chris

 I put together a process to do that some months ago. I can give you a
 working part of it which will give you two critical items:
 the title of the activity and the addon where it is found.

First run collector.py. This will access activities.sugarlabs.org and
download six web pages giving 100 activities each (except the last),
for a total of 567 activities. These will appear as aslo1, ..., aslo6.
Next run scraper.py. This uses BeautifulSoup to scrape the six web
pages, giving six collections, collection1, ..., collection6. Each line
gives the addon and title of an activity. The scraper.py program does
not access the network. You may need to install BeautifulSoup to run
the scraper.py program.
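The scraping step Tony describes can also be sketched with only the stdlib html.parser, for anyone without BeautifulSoup installed. The `/addon/<number>` link format below is an assumption about the saved ASLO pages, not taken from the real markup:

```python
# Stdlib-only sketch of what scraper.py does with BeautifulSoup:
# pull (addon number, title) pairs out of a saved ASLO page.
from html.parser import HTMLParser
import re

class AddonLinkParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.pairs = []       # collected (addon_number, title) tuples
        self._current = None  # addon number of the currently open <a>

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href") or ""
            m = re.search(r"/addon/(\d+)", href)
            self._current = m.group(1) if m else None

    def handle_data(self, data):
        if self._current and data.strip():
            self.pairs.append((self._current, data.strip()))
            self._current = None

    def handle_endtag(self, tag):
        if tag == "a":
            self._current = None

parser = AddonLinkParser()
parser.feed('<li><a href="/en-US/sugar/addon/4043">Terminal</a></li>')
print(parser.pairs)  # [('4043', 'Terminal')]
```

Feeding each saved aslo1..aslo6 file through such a parser would give the collection1..collection6 lists.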

The original collected more information from each activity with a goal
of building a csv file that could be used to record information like
the size of the activity, the most recent version, whether po is
supported, whether gtk+3 is supported, whether it is a sugar-web or
python activity, and whether it uses gstreamer 0.1 or 1.0 and so on.

 Tony


 On 03/12/2016 10:22 AM, Chris Leonard wrote:
> Can someone with systematic access to ASLO do a data dump for me?  Any
> format will do (txt, csv, xls, ods, etc.)
>
> I am interested in reviewing all known activities (at least those in
> ASLO) for i18n/L10n and investigating further to see if we can
> implement i18n/L10n where it does not exist.  I would also like to
> check on presence of repo links. I know this was only recently
> requested, but I might as well check on it as I'll need it for
> i18n/L10n follow up.
>
> I've been updating:
>
>
> https://wiki.sugarlabs.org/go/Translation_Team/Pootle_Projects/Repositories
>
> data fields of interest
>
> Activity name
> Activity number
> Activity version (latest)
> Author(s) name
> Author(s) number
> Repo link (if available)
>
> Thanks in advance for any assistance.
>
> cjl
> ___
> Sugar-devel mailing list
> Sugar-devel@lists.sugarlabs.org
> http://lists.sugarlabs.org/listinfo/sugar-devel

>
>


Re: [Sugar-devel] List of activities on ASLO

2016-03-14 Thread Sebastian Silva
Guys, one thing that is missing here is the bundle_id (org.laptop.Terminal).

This is supposed to be the unique identifier of each activity. Otherwise
it will be very difficult and error-prone to match, prune duplicates, etc.

Regards,
Sebastian


On 14/03/16 at 10:18, Chris Leonard wrote:
> This time with attachment.
>
> On Mon, Mar 14, 2016 at 11:18 AM, Chris Leonard
>  wrote:
>> Thanks for the scripts, Tony.  I filled in other fields with grep of
>> the aslo# files.  I think scraper.py does some sorting that causes the
>> collection list and the grep scrapes not to align properly (in part
>> because of case-sensitive sorting).  It was laborious; it would be nice
>> to have an improved version of such a data-collection tool for periodic
>> monitoring of activity status (PO filename, or latest version number
>> and update date).
>>
>> I thought others might be interested in the results (so far).
>>
>> What I would love to add as columns on this spreadsheet are:
>>
>> ported to gtk3?
>>
>> set up for i18n?
>>
>> repo location?
>>
>> POT in Pootle.
>>
>> In going through this sheet, there are some apparent duplications of
>> activities (possibly for single language support).
>>
>> cjl
>>
>>
>> On Sat, Mar 12, 2016 at 12:57 AM, Tony Anderson  wrote:
>>> Hi, Chris
>>>
>>> I put together a process to do that some months ago. I can give you a
>>> working part of it which will give you two critical items:
>>> the title of the activity and the addon where it is found.
>>>
>>> First run the collector.py. This will access activities.sugarlabs.org and
>>> download six web pages giving 100 activities each (except the last), for a
>>> total
>>> of 567 activities. This will appear as aslo1, ..., aslo6. Next run
>>> scraper.py. This uses beautifulsoup to scrape the six web pages giving six
>>> collections,
>>> collection1, ..., collection6. Each line gives the addon and title of an
>>> activity. The scraper.py program does not access the network. You may need
>>> to install
>>> beautifulsoup to run the scraper.py program.
>>>
>>> The original collected more information from each activity with a goal of
>>> building a csv file that could be used to record information like the size
>>> of the
>>> activity, the most recent version, whether po is supported, whether gtk+3 is
>>> supported, whether it is a sugar-web or python activity, and whether it uses
>>> gstreamer 0.1 or 1.0 and so on.
>>>
>>> Tony
>>>
>>>
>>> On 03/12/2016 10:22 AM, Chris Leonard wrote:
 Can someone with systematic access to ASLO do a data dump for me?  Any
 format will do (txt, csv, xls, ods, etc.)

 I am interested in reviewing all known activities (at least those in
 ASLO) for i18n/L10n and investigating further to see if we can
 implement i18n/L10n where it does not exist.  I would also like to
 check on presence of repo links. I know this was only recently
 requested, but I might as well check on it as I'll need it for
 i18n/L10n follow up.

 I've been updating:


 https://wiki.sugarlabs.org/go/Translation_Team/Pootle_Projects/Repositories

 data fields of interest

 Activity name
 Activity number
 Activity version (latest)
 Author(s) name
 Author(s) number
 Repo link (if available)

 Thanks in advance for any assistance.

 cjl
>>>






Re: [Sugar-devel] List of activities on ASLO

2016-03-14 Thread Chris Leonard
This time with attachment.

On Mon, Mar 14, 2016 at 11:18 AM, Chris Leonard
 wrote:
> Thanks for the scripts, Tony.  I filled in other fields with grep of
> the aslo# files.  I think scraper.py does some sorting that causes the
> collection list and the grep scrapes not to align properly (in part
> because of case-sensitive sorting).  It was laborious; it would be nice
> to have an improved version of such a data-collection tool for periodic
> monitoring of activity status (PO filename, or latest version number
> and update date).
>
> I thought others might be interested in the results (so far).
>
> What I would love to add as columns on this spreadsheet are:
>
> ported to gtk3?
>
> set up for i18n?
>
> repo location?
>
> POT in Pootle.
>
> In going through this sheet, there are some apparent duplications of
> activities (possibly for single language support).
>
> cjl
>
>
> On Sat, Mar 12, 2016 at 12:57 AM, Tony Anderson  wrote:
>> Hi, Chris
>>
>> I put together a process to do that some months ago. I can give you a
>> working part of it which will give you two critical items:
>> the title of the activity and the addon where it is found.
>>
>> First run the collector.py. This will access activities.sugarlabs.org and
>> download six web pages giving 100 activities each (except the last), for a
>> total
>> of 567 activities. This will appear as aslo1, ..., aslo6. Next run
>> scraper.py. This uses beautifulsoup to scrape the six web pages giving six
>> collections,
>> collection1, ..., collection6. Each line gives the addon and title of an
>> activity. The scraper.py program does not access the network. You may need
>> to install
>> beautifulsoup to run the scraper.py program.
>>
>> The original collected more information from each activity with a goal of
>> building a csv file that could be used to record information like the size
>> of the
>> activity, the most recent version, whether po is supported, whether gtk+3 is
>> supported, whether it is a sugar-web or python activity, and whether it uses
>> gstreamer 0.1 or 1.0 and so on.
>>
>> Tony
>>
>>
>> On 03/12/2016 10:22 AM, Chris Leonard wrote:
>>>
>>> Can someone with systematic access to ASLO do a data dump for me?  Any
>>> format will do (txt, csv, xls, ods, etc.)
>>>
>>> I am interested in reviewing all known activities (at least those in
>>> ASLO) for i18n/L10n and investigating further to see if we can
>>> implement i18n/L10n where it does not exist.  I would also like to
>>> check on presence of repo links. I know this was only recently
>>> requested, but I might as well check on it as I'll need it for
>>> i18n/L10n follow up.
>>>
>>> I've been updating:
>>>
>>>
>>> https://wiki.sugarlabs.org/go/Translation_Team/Pootle_Projects/Repositories
>>>
>>> data fields of interest
>>>
>>> Activity name
>>> Activity number
>>> Activity version (latest)
>>> Author(s) name
>>> Author(s) number
>>> Repo link (if available)
>>>
>>> Thanks in advance for any assistance.
>>>
>>> cjl
>>
>>


ASLO.ods
Description: application/vnd.oasis.opendocument.spreadsheet


Re: [Sugar-devel] List of activities on ASLO

2016-03-14 Thread Chris Leonard
Thanks for the scripts, Tony.  I filled in other fields with grep of
the aslo# files.  I think scraper.py does some sorting that causes the
collection list and the grep scrapes not to align properly (in part
because of case-sensitive sorting).  It was laborious; it would be nice
to have an improved version of such a data-collection tool for periodic
monitoring of activity status (PO filename, or latest version number
and update date).
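The misalignment described above is typical when one side is sorted with ASCII (case-sensitive) ordering, where every uppercase letter sorts before every lowercase one, and the other side is not. Sorting both lists with a case-insensitive key keeps the rows matched; the titles here are invented examples:

```python
# Two lists of the same activity titles, ordered differently.
# Plain sorted() would put "Browse" and "Zebra Puzzle" before
# "abacus"; a casefolded key gives both lists the same order.
titles_from_scraper = ["abacus", "Browse", "Zebra Puzzle"]
titles_from_grep = ["Zebra Puzzle", "abacus", "Browse"]

aligned_a = sorted(titles_from_scraper, key=str.casefold)
aligned_b = sorted(titles_from_grep, key=str.casefold)

print(aligned_a == aligned_b)  # True
print(aligned_a)               # ['abacus', 'Browse', 'Zebra Puzzle']
```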

I thought others might be interested in the results (so far).

What I would love to add as columns on this spreadsheet are:

ported to gtk3?

set up for i18n?

repo location?

POT in Pootle.

In going through this sheet, there are some apparent duplications of
activities (possibly for single language support).

cjl


On Sat, Mar 12, 2016 at 12:57 AM, Tony Anderson  wrote:
> Hi, Chris
>
> I put together a process to do that some months ago. I can give you a
> working part of it which will give you two critical items:
> the title of the activity and the addon where it is found.
>
> First run the collector.py. This will access activities.sugarlabs.org and
> download six web pages giving 100 activities each (except the last), for a
> total
> of 567 activities. This will appear as aslo1, ..., aslo6. Next run
> scraper.py. This uses beautifulsoup to scrape the six web pages giving six
> collections,
> collection1, ..., collection6. Each line gives the addon and title of an
> activity. The scraper.py program does not access the network. You may need
> to install
> beautifulsoup to run the scraper.py program.
>
> The original collected more information from each activity with a goal of
> building a csv file that could be used to record information like the size
> of the
> activity, the most recent version, whether po is supported, whether gtk+3 is
> supported, whether it is a sugar-web or python activity, and whether it uses
> gstreamer 0.1 or 1.0 and so on.
>
> Tony
>
>
> On 03/12/2016 10:22 AM, Chris Leonard wrote:
>>
>> Can someone with systematic access to ASLO do a data dump for me?  Any
>> format will do (txt, csv, xls, ods, etc.)
>>
>> I am interested in reviewing all known activities (at least those in
>> ASLO) for i18n/L10n and investigating further to see if we can
>> implement i18n/L10n where it does not exist.  I would also like to
>> check on presence of repo links. I know this was only recently
>> requested, but I might as well check on it as I'll need it for
>> i18n/L10n follow up.
>>
>> I've been updating:
>>
>>
>> https://wiki.sugarlabs.org/go/Translation_Team/Pootle_Projects/Repositories
>>
>> data fields of interest
>>
>> Activity name
>> Activity number
>> Activity version (latest)
>> Author(s) name
>> Author(s) number
>> Repo link (if available)
>>
>> Thanks in advance for any assistance.
>>
>> cjl
>
>


Re: [Sugar-devel] List of activities on ASLO

2016-03-12 Thread Sebastian Silva
Hi Tony,
Possibly hundreds of thousands of users have the Sugar Network
installed on their XO laptops. When (if) they connect to the Internet,
we hear from them from time to time; we've heard from 30K+ of them.
It's really a project with a different scope than the deployments you
work with. We sought to have the broadest impact, but now we are
hoping to dig deeper by living in the rainforest and founding what is,
in practical terms, a school.

I'm not saying you need to reuse our code or even use the SN API.
I'm just saying web scraping is complex when the simpler solution is
to dump the data itself from the database.

If you want shell access to this machine or a database dump, don't
hesitate to ask me or any other infrastructure team member (e.g. through
the systems at lists.sugarlabs.org mailing list).

Regards,
Sebastian
Sebastian Silva
http://somosazucar.org/



2016-03-12 17:32 GMT+08:00 Tony Anderson :
> Hi, Sebastian
>
> I have tried to follow the Sugar Network project, but I have never gotten a
> good enough understanding to see its benefits. Unfortunately, for a long time
> it was a shell without content. Then it appeared to be bound to Internet
> access, which really makes it unusable in the deployments I work with. At
> one time you talked about a sneakernet approach to email, which I think is a
> critical need - but nothing seemed to come of it.
>
> I don't understand why ASLO doesn't support Chris Leonard's requirement. The
> use of the addon number adds one more level of indirection, but it should
> be simple for ASLO to provide a URL which returns an index of addons and
> activity names. What I did was to write Python code that creates this
> missing list of addons, Sugar activity names, and version numbers (the
> version number on the activity page, which is often not the most current).
> The code needs some tweaking to deal with 'hidden' versions more recent
> than the one on the page. There is also a need to find the GitHub location
> of each activity. One drawback of the migration to GitHub is that the
> activities are under several different owners' folders. According to the
> Python code there are not 567 activities; however, about 100 are the
> GCompris activities which Aleksey implemented some years back. Sadly, they
> no longer work. It is simpler to install the GCompris bundle in GNOME and
> use a wrapper to launch it in Sugar, which at least gives us access to the
> most recent versions of the various activities.
>
> Tony
>
>
>
>
> On 03/12/2016 05:08 PM, Sebastian Silva wrote:
>>
>> Hi Tony,
>> I find we often do similar things with different approaches. This is
>> always good, as I think it confirms we have similar observations from the
>> field (from across the globe!).
>>
>> From 2011 to 2014 Alsroot and I developed the Sugar Network, with
>> UI/concept design and also resource planning from Laura. The goal was
>> to broaden access to / replace the ASLO Library (among other, more
>> ambitious goals). For this, among other things, Alsroot developed an API
>> that would allow one to query (and feed!) the ASLO Library... eventually
>> even replace it.
>>
>> The docs for the API are here:
>> http://wiki.sugarlabs.org/go/Sugar_Network/API
>>
>> For instance, it is possible to get the list of activities thus:
>> http://node.sugarlabs.org/context?type=activity&offset=0
>>
>> Change the offset to page through all 470 entries (a hard limit on the
>> server). I don't think this is synchronizing with ASLO, but it was
>> designed to do so.
>>
>> We could not spark interest either with Sugar Labs or with any other
>> deployment, and eventually the Ministry of Education of Peru lost interest
>> in supporting the project. However, I would be very happy if Alsroot's
>> obsessively polished work could be used for much more. In Peru this
>> continues to be used and deployed (and is visible via the web frontend at
>> http://network.sugarlabs.org/ ). Laura continues to monitor and admin the
>> contents provided by the children, as well as monitor statistics. I have
>> stopped developing it because I don't see a direct way to deploy a better
>> user experience (yet). However, the dream of a Sugar Doers Network lives on.
>>
>> A short intro (Spanish only) to the intended usage (with Sugar shell
>> integration) is here:
>> http://www.dailymotion.com/embed/video/xrapcp
>> (this is an early version 0.2 - last one was 0.9).
>>
>> A revisiting of these ideas should be a priority for Sugar Labs, IMHO.
>>
>> Regards,
>> Sebastian
>>
>>
>>
>> On 12/03/16 at 00:57, Tony Anderson wrote:
>>>
>>> Hi, Chris
>>>
>>> I put together a process to do that some months ago. I can give you a
>>> working part of it which will give you two critical items:
>>> the title of the activity and the addon where it is found.
>>>
>>> First run the collector.py. This will access activities.sugarlabs.org and
>>> download six web pages giving 100 activities each (except the last), for a
>>> total
>>> of 567 activities. This will appear as aslo1, ..., aslo6.

Re: [Sugar-devel] List of activities on ASLO

2016-03-12 Thread Sebastian Silva

Hi Tony,
I find we often do similar things with different approaches. This is
always good, as I think it confirms we have similar observations from
the field (from across the globe!).


From 2011 to 2014 Alsroot and I developed the Sugar Network, with
UI/concept design and also resource planning from Laura. The goal was
to broaden access to / replace the ASLO Library (among other, more
ambitious goals). For this, among other things, Alsroot developed an
API that would allow one to query (and feed!) the ASLO Library...
eventually even replace it.


The docs for the API are here:
http://wiki.sugarlabs.org/go/Sugar_Network/API

For instance, it is possible to get the list of activities thus:
http://node.sugarlabs.org/context?type=activity&offset=0

Change the offset to page through all 470 entries (a hard limit on the
server). I don't think this is synchronizing with ASLO, but it was
designed to do so.
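Paging through that endpoint by advancing the offset parameter could be sketched like this. The response shape assumed here (a JSON object with a "result" list) is a guess typical of such APIs, not confirmed from the Sugar Network docs:

```python
# Paginate node.sugarlabs.org/context by advancing ?offset= until a
# page comes back short. `fetch` is injected so the paging logic can
# be exercised without network access.
import json
from urllib.request import urlopen

BASE = "http://node.sugarlabs.org/context?type=activity&offset={}"

def fetch_all(fetch, page_size=100):
    """Collect every entry, advancing offset one page at a time."""
    entries, offset = [], 0
    while True:
        page = fetch(BASE.format(offset))
        entries.extend(page)
        if len(page) < page_size:  # short (or empty) page: done
            return entries
        offset += page_size

def http_fetch(url):
    """Assumed response shape: {"result": [...], ...}."""
    with urlopen(url) as resp:
        return json.load(resp)["result"]

# real use would be: activities = fetch_all(http_fetch)
```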


We could not spark interest either with Sugar Labs or with any other
deployment, and eventually the Ministry of Education of Peru lost
interest in supporting the project. However, I would be very happy if
Alsroot's obsessively polished work could be used for much more. In Peru
this continues to be used and deployed (and is visible via the web
frontend at http://network.sugarlabs.org/ ). Laura continues to monitor
and admin the contents provided by the children, as well as monitor
statistics. I have stopped developing it because I don't see a direct
way to deploy a better user experience (yet). However, the dream of a
Sugar Doers Network lives on.


A short intro (Spanish only) to the intended usage (with Sugar shell
integration) is here:

http://www.dailymotion.com/embed/video/xrapcp
(this is an early version 0.2 - last one was 0.9).

A revisiting of these ideas should be a priority for Sugar Labs, IMHO.

Regards,
Sebastian



On 12/03/16 at 00:57, Tony Anderson wrote:

Hi, Chris

I put together a process to do that some months ago. I can give you a
working part of it which will give you two critical items: the title of
the activity and the addon where it is found.

First run collector.py. This will access activities.sugarlabs.org and
download six web pages giving 100 activities each (except the last),
for a total of 567 activities. These will appear as aslo1, ..., aslo6.
Next run scraper.py. This uses BeautifulSoup to scrape the six web
pages, giving six collections, collection1, ..., collection6. Each line
gives the addon and title of an activity. The scraper.py program does
not access the network. You may need to install BeautifulSoup to run
the scraper.py program.

The original collected more information from each activity with a goal
of building a csv file that could be used to record information like
the size of the activity, the most recent version, whether po is
supported, whether gtk+3 is supported, whether it is a sugar-web or
python activity, and whether it uses gstreamer 0.1 or 1.0 and so on.

Tony

On 03/12/2016 10:22 AM, Chris Leonard wrote:

Can someone with systematic access to ASLO do a data dump for me?  Any
format will do (txt, csv, xls, ods, etc.)

I am interested in reviewing all known activities (at least those in
ASLO) for i18n/L10n and investigating further to see if we can
implement i18n/L10n where it does not exist.  I would also like to
check on presence of repo links. I know this was only recently
requested, but I might as well check on it as I'll need it for
i18n/L10n follow up.

I've been updating:

https://wiki.sugarlabs.org/go/Translation_Team/Pootle_Projects/Repositories 



data fields of interest

Activity name
Activity number
Activity version (latest)
Author(s) name
Author(s) number
Repo link (if available)

Thanks in advance for any assistance.

cjl



