GSoC 2008 Voice box

2008-03-29 Thread Frederik Sdun
Hi,

I had some problems mailing to the list, so I'm taking my last
chance.

I'm a student at the German university of applied sciences in Offenburg
and I want to participate in Openmoko's Google Summer of Code and more.

I'm very interested in a feature on the wish list: the voicebox [1]. I
think this is a great feature for users because they can see and use
it from the beginning. I also thought about calendar integration, so users
do not have to switch to this mode manually; instead the phone replies with
something like "I'm in a meeting till 2.30 p.m., I'll call you later" ([2])
or sends a message.

If you prefer one of the official list entries, I want to code one of the
VoIP parts (9, 10, 16). This is a great feature if you are at home or a
customer who has a contract covering a lot of hotspots. It might not be
useful over UMTS yet because most providers block these ports.

I hope I can join the community.

Regards 
Frederik Sdun 


[1]http://wiki.openmoko.org/wiki/Wish_List#Voice_Mailbox
[2]http://projects.openmoko.org/projects/mokotts/


___
Openmoko community mailing list
community@lists.openmoko.org
http://lists.openmoko.org/mailman/listinfo/community


Re: GSoC 2008

2008-03-28 Thread D. R. Newman

Michael 'Mickey' Lauer wrote:

Hi guys,

as you may have already noticed, Openmoko Inc. has been accepted
as a mentoring organization for the Google Summer of Code 2008.



Please note that the list of ideas found on the wiki page is by no means
comprehensive, it's rather a bunch of things we think would be cool. If
you come up with even cooler stuff, be our guest :)


I have a proposal that I put in for funding from a research council in 
the UK. It was turned down by them, but might make an interesting 
project for the Google Summer of Code.


The challenge is: "Can we customise and internationalise the OpenMoko 
interface so that it can be used by farmers in villages in Bangladesh?".


What they need from a mobile 'phone is not the same as what urban
businessmen and teenagers want. Yet at present you cannot even send SMS
messages in Bengali in Bangladesh. All the handsets require you to write
in English (or at least in Latin characters).


Imagine a 'phone that can easily be switched between character sets 
(using the GTK internationalisation tools), and also has icons and 
interaction modes that make sense to semi-literate farmers whose homes 
look nothing like a conventional home icon.


I have a Ph.D. student coming back from fieldwork in Bangladesh next 
month. He has been studying how groups of farmers use mobile 'phones 
(e.g. do they use them to find pricing information, cutting out 
middlemen) and what difficulties they are having learning to use the 
devices. So he will be able to set the practical requirements.


The technical context is internationalisation and usability design.

--
Dr. David R. Newman, Queen's University Belfast, School of Management
and Economics, BELFAST BT7 1NN, Northern Ireland (UK)
Tel. +44 28 9097 3643  mailto:[EMAIL PROTECTED]
http://www.qub.ac.uk/mgt/ http://www.e-consultation.org/



RE: GSoC 2008

2008-03-26 Thread Crane, Matthew

> But as we can only choose one of them for this application, you should
> be prepared for other applications, too.

Yeah, whatever API is provided for the accelerometer should return some
form of condensed data, but not necessarily be tied to the idea of
gestures.

Different applications may want the data at the same time.

E.g. a background car-crash detector (one that dials out and knows when
it's on the road) working concurrently with gestures for answering the
phone, etc.

Maybe the sort of thing where an API would allow an app to register a
set of gestures, defined mathematically, and only one system process
polls for matching events.  Multiple processes polling the data, or even
processing the gestures, doesn't sound like it would work as nicely.
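A minimal sketch of such a single-poller gesture registry might look like the
following Python prototype. All names are hypothetical, and the trivial
list-equality "matcher" is a stand-in for a real matching metric:

```python
# Hypothetical sketch: one poller process matches sample windows against
# registered gesture templates and dispatches events to app callbacks,
# so only a single process ever touches the raw accelerometer data.
from typing import Callable, List, Tuple

class GestureRegistry:
    def __init__(self) -> None:
        self._gestures: List[Tuple[str, List[int], Callable[[str], None]]] = []

    def register(self, name: str, template: List[int],
                 callback: Callable[[str], None]) -> None:
        """An app registers a mathematically defined gesture template."""
        self._gestures.append((name, template, callback))

    def feed(self, window: List[int]) -> None:
        """The single system poller feeds each sample window here."""
        for name, template, callback in self._gestures:
            if window == template:  # stand-in for a real matching metric
                callback(name)

registry = GestureRegistry()
seen: List[str] = []
registry.register("shake", [9, -9, 9], seen.append)
registry.feed([0, 0, 0])   # no match, no event dispatched
registry.feed([9, -9, 9])  # matches the template -> callback fires
```

In a real system the callback would be replaced by an IPC event, but the shape
(register once, one poller dispatches) is the same.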

Matt


-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Stefan
Schmidt
Sent: Tuesday, March 25, 2008 10:36 AM
To: community@lists.openmoko.org
Subject: Re: GSoC 2008

Hello.

On Mon, 2008-03-24 at 19:02, Niluge KiWi wrote:
> 
> I'm interested in the accelerometer features [1]: Recognising gestures
> is a really important part of the interface between the user and the
> phone.

Seems this idea gets the interest of a lot of people. Nice. :)


> With the two accelerometers in the FreeRunner, I think we can recognise
> lots of gestures, not only simple ones like a "click" (which is already
> recognised by the accelerometers used in the FreeRunner). The main
> difficulty is probably to extract the useful data from the gesture
> noise: calibration may take time. The goal is to have an almost
> pre-calibrated library (an idea from the wish-list in the Wiki is to
> allow the user to record their own gestures, but I think it's not easy
> to make it simple for the end-user).

Letting the user add new gestures is a key feature IMHO. Also letting
them combine different gestures into new ones. We should make it easy
for people to be creative with this. That's where innovation can
start. :)

If we can have a preset of already-known gestures shipped with the
device, great.

> I'm also interested in working on ambient noise detection as a second
> choice.

Also interesting. What I never understood completely is what kind of
cool stuff we can do with this. I mean, detecting the ambient volume
level and adjusting the ringing etc. is nice, but can we do more with it?
Fancy things like detecting whether we are in a car or a plane and
reacting accordingly?

regards
Stefan Schmidt



Re: GSoC 2008

2008-03-26 Thread Federico Lorenzi
On Tue, Mar 25, 2008 at 4:36 PM, Stefan Schmidt <[EMAIL PROTECTED]> wrote:
> Hello.
>
>
>  On Mon, 2008-03-24 at 19:02, Niluge KiWi wrote:
>  >
>  > I'm interested in the accelerometers features [1]: Recognising gestures
>  > is a really important part of the interface between the user and the
>  > phone.
>
>  Seems this idea gets the interest of a lot of people. Nice. :)
>
>  But as we can only choose one of them for this application, you should
>  be prepared for other applications, too.
>
>
>  > With the two accelerometers in the FreeRunner, I think we can recognise
>  > lots of gestures, not only simple ones like a "click" (which is already
>  > recognised by the accelerometers used in the FreeRunner). The main
>  > difficulty is probably to extract the useful data from the gesture
>  > noise: calibration may take time. The goal is to have an almost
>  > pre-calibrated library (an idea from the wish-list in the Wiki is to
>  > allow the user to record their own gestures, but I think it's not easy
>  > to make it simple for the end-user).
>
>  Letting the user add new gestures is a key feature IMHO. Also letting
>  them combine different gestures into new ones. We should make it easy
>  for people to be creative with this. That's where innovation can
>  start. :)
>
>  If we can have a preset of already-known gestures shipped with the
>  device, great.
>
>
>  > I'm also interested in working on ambient noise detection as a second
>  > choice.
>
>  Also interesting. What I never understood completely is what kind of
>  cool stuff we can do with this. I mean, detecting the ambient volume
>  level and adjusting the ringing etc. is nice, but can we do more with it?
>  Fancy things like detecting whether we are in a car or a plane and
>  reacting accordingly?
Maybe the GPS would be better suited to that...
Speed below 30 km/h = walking/running
Speed above 40 km/h and below 240 km/h = driving
Speed above 600 km/h = plane.

Naturally, however, there should be an option to override this :)
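The proposed thresholds can be sketched as a simple classifier; the 30-40 km/h
and 240-600 km/h gaps are deliberately left as "unknown", since the mail does
not assign them:

```python
# Sketch of the GPS-speed heuristic suggested above. The thresholds come
# straight from the mail; the category names are illustrative only.
def classify_speed(kmh: float) -> str:
    if kmh > 600:
        return "plane"
    if 40 < kmh <= 240:
        return "driving"
    if kmh < 30:
        return "walking/running"
    return "unknown"  # the gaps between the proposed bands
```

An override option, as suggested, would just bypass this function with a
user-chosen mode.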
Cheers,
Federico



Re: GSoC 2008

2008-03-26 Thread Stefan Schmidt
Hello.

On Mon, 2008-03-24 at 19:02, Niluge KiWi wrote:
> 
> I'm interested in the accelerometers features [1]: Recognising gestures
> is a really important part of the interface between the user and the
> phone.

Seems this idea gets the interest of a lot of people. Nice. :)

But as we can only choose one of them for this application, you should
be prepared for other applications, too.

> With the two accelerometers in the FreeRunner, I think we can recognise
> lots of gestures, not only simple ones like a "click" (which is already
> recognised by the accelerometers used in the FreeRunner). The main
> difficulty is probably to extract the useful data from the gesture
> noise: calibration may take time. The goal is to have an almost
> pre-calibrated library (an idea from the wish-list in the Wiki is to
> allow the user to record their own gestures, but I think it's not easy
> to make it simple for the end-user).

Letting the user add new gestures is a key feature IMHO. Also letting
them combine different gestures into new ones. We should make it easy
for people to be creative with this. That's where innovation can
start. :)

If we can have a preset of already-known gestures shipped with the
device, great.

> I'm also interested in working on ambient noise detection as a second
> choice.

Also interesting. What I never understood completely is what kind of
cool stuff we can do with this. I mean, detecting the ambient volume
level and adjusting the ringing etc. is nice, but can we do more with it?
Fancy things like detecting whether we are in a car or a plane and
reacting accordingly?

regards
Stefan Schmidt




Re: GSoC 2008

2008-03-25 Thread Andy Green

Somebody in the thread at some point said:
> Somebody in the thread at some point said:
>> The raw accelerometer data is predicated around byte data for X Y Z per
>> sample per motion sensor.
> 
>> Ultimately the "gesture recognition" action is about eating 200 3-byte
>> packets of data a second and issuing only one or two bytes per second
>> about any "gesture" that was seen.
> 
> The Spec document for the accelerometers says the refresh rate can be
> chosen: 100Hz, or 400Hz. The three values are stored as 2's complement
> number in one byte for each axis.

Yes.  In GTA02 the sensors are serviced by the CPU via separate
interrupts, so we have to eat the power consumption of 200
interrupts/sec... I figure 800 interrupts/sec might be a bit much.  So
currently it works at 100 Hz.  Maybe that means higher-frequency
subtleties are lost; I guess we can find out.

> Somebody in the thread at some point said:
>> Or a rotating shake with axis long side: it looks completely different when
>> device is upright, 45°, or flat (in fact this gesture isn't detectable at all
>> in upright position, in the first place).
>> Only correct way is to calculate real heading and velocity vector of device,
>> as accurate as possible. Then accumulate a "route", and this route you may
...
> With the two accelerometers, we can calculate the position(and the
> velocity) of the two accelerometers from a start point (position and
> velocity). But this is not enough to have the position of the whole
> phone in the space : we don't know the rotation movement along the
> axis defined by the two accelerometers.
...
> As we know that the relative position of the two accelerometers is
> fixed, it could help to detect calculation errors, and maybe correct
> them (a little...).

The deal is they are placed like this

 /  <-- "top accel" at 45 degree angle at top of board
to left of transducer

_  <-- "bottom accel" unrotated to right of mic

Both times "pin 1" is towards the bottom left corner of the PCB.

> Regarding the work on a MPU, if I've understood what I've read on the
> mailing list archives, it's still just and idea, and the FreeRunner
> wont have one, am I right?

Right.  You have to use the main CPU there.

> For the GSoC, I think working on a simple library which uses the CPU
> would be already a good thing. (but we can work with the idea in mind
> that the code will need to be ported for a MPU).

Absolutely.

> The library could provide two things :
> * the recognition of typical gestures
> * the position and velocity evolutions in time

Maybe it makes sense to put this functionality into the interrupt
service routine, because if we stay with raw accel data, the userspace
app blocking on /dev/input/event* is also woken at ~100 Hz or so I
guess, and that is not a great way to save power.

-Andy


Re: GSoC 2008

2008-03-25 Thread Niluge kiwi
Somebody in the thread at some point said:
> The raw accelerometer data is predicated around byte data for X Y Z per
> sample per motion sensor.

> Ultimately the "gesture recognition" action is about eating 200 3-byte
> packets of data a second and issuing only one or two bytes per second
> about any "gesture" that was seen.

The spec document for the accelerometers says the refresh rate can be
chosen: 100 Hz or 400 Hz. The three values are stored as two's-complement
numbers, one byte for each axis.
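Decoding such a raw sample could be sketched like this; the X, Y, Z byte order
is taken from the thread and should be treated as an assumption:

```python
# Sketch: decode one raw 3-byte sample (X, Y, Z), each axis an 8-bit
# two's-complement value, as described in the accelerometer spec.
from typing import Tuple

def decode_sample(raw: bytes) -> Tuple[int, int, int]:
    def s8(b: int) -> int:
        # Interpret one byte as a signed 8-bit two's-complement value.
        return b - 256 if b > 127 else b
    x, y, z = raw
    return s8(x), s8(y), s8(z)

decode_sample(bytes([0x00, 0x7F, 0xFF]))  # -> (0, 127, -1)
```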


Somebody in the thread at some point said:
> Or a rotating shake with axis long side: it looks completely different when
> device is upright, 45°, or flat (in fact this gesture isn't detectable at all
> in upright position, in the first place).
> Only correct way is to calculate real heading and velocity vector of device,
> as accurate as possible. Then accumulate a "route", and this route you may
> compare to a set of templates for best match (after transforming for size and
> orientation). All this is very low accuracy, so 16bit integer and sine-tables
> will do easily, but i don't think it will become much cheaper than this.
> That is, if the gestures are more complex than just "one sharp shake" "2
> gentle shakes" etc.

With the two accelerometers, we can calculate the position (and the
velocity) of the two accelerometers from a start point (position and
velocity). But this is not enough to have the position of the whole
phone in space: we don't know the rotational movement around the
axis defined by the two accelerometers.
I didn't manage to open and view the FreeRunner hardware source
files (the software needed to view them seems to be closed source and not
free), so I don't know the position of the two accelerometers on the phone,
but I hope they are well placed (so that we don't really need the unknown
angle).
(I also tried to spot the chips on the motherboard shots available on
the wiki, but didn't find them...)

As we know that the relative position of the two accelerometers is
fixed, it could help to detect calculation errors, and maybe correct
them (a little...).



Regarding the work on an MPU, if I've understood what I've read in the
mailing list archives, it's still just an idea, and the FreeRunner
won't have one, am I right?

For the GSoC, I think working on a simple library which uses the CPU
would already be a good thing (but we can work with the idea in mind
that the code will need to be ported to an MPU).


The library could provide two things:
* the recognition of typical gestures
* the evolution of position and velocity over time

I don't know yet whether it is necessary to calculate the latter to
obtain the former: it probably depends on the complexity of the
gestures to recognize, so we could divide the gestures into two groups:
the simple ones, like a fast acceleration in any direction, and more
complex ones, like a Pi/2 rotation (landscape mode).
We should also use the "click" and "double click" recognition (on each
axis) already provided by the chip itself, because it needs no CPU at
all.
The hardware also provides free-fall detection.


If we could build such a library, it would allow us to create many
things (gestures for an easier interface, and position and velocity
for games, but not only).
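For the position/velocity part, the naive starting point is double integration
of the acceleration samples. A toy sketch (ignoring gravity compensation and
sensor drift, which are what make this hard in practice):

```python
# Naive sketch: derive velocity and position by integrating raw
# acceleration samples at a fixed timestep. Real accelerometer data
# drifts badly, so this is only the conceptual starting point.
from typing import Iterable, Tuple

def integrate(samples: Iterable[float], dt: float = 0.01,
              v0: float = 0.0, x0: float = 0.0) -> Tuple[float, float]:
    v, x = v0, x0
    for a in samples:
        v += a * dt   # velocity: integral of acceleration
        x += v * dt   # position: integral of velocity
    return v, x

# 1 second of constant 1 m/s^2 at 100 Hz -> v is roughly 1 m/s
v, x = integrate([1.0] * 100, dt=0.01)
```

Per axis this gives a 1-D track; a real library would run it on all three axes
and correct the drift with the known fixed distance between the two sensors.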



gsoc 2008 moblog

2008-03-25 Thread David Laurell
Hello OpenMoko community! I'm a student from Sweden and I'm interested 
in the moblog idea on the Google Summer of Code page on the wiki. Does this 
idea include coding on the server side, or is there already a moblog for 
which one should code a client?


Also, how hard is it to learn to make applications in Java ME if you are 
only familiar with Java SE?


/David Laurell
a student from sweden



Re: GSoC 2008

2008-03-25 Thread joerg
On Tue, 25 Mar 2008, Andy Green wrote:
> Somebody in the thread at some point said:
> > On Tue, 25 Mar 2008, Andy Green wrote:
> > [...]
> >> It means that a perfect solution is predicated around
> >>
> >>  - 16 bit integer arithmetic -- ha "no float" is too easy
> >>  - per-sample short processing rather than large batching
> >>  - multiply is expensive (but hey shifting is cheap :-) )
> > Divide?
> > Hey, for normalizing the vector direction of gestures to different
> > orientations of the Neo, we will need trigonometric calculations, no?
> > Anyway, it shouldn't be much harder than the trainable OCR of the Palm
> > (TealScript?), and I think we can compare the power of the Palm CPU to
> > that of the MPU.
> >
> > If everything else fails, the MPU has to buffer the G-meter data, and
> > recognition of the actual gesture has to be done on the main CPU. (no
> > real option)
> 
> Well it depends entirely on the algorithm and how smart we are.  If we
> can bend the algorithm to use right shift instead of divide, or small
> precomputed tables for the key actions, we can do it on a pretty weak
> device.
> 
> Ultimately the "gesture recognition" action is about eating 200 3-byte
> packets of data a second and issuing only one or two bytes per second
> about any "gesture" that was seen.  I guess one way or another we use
> ringbuffers to keep averages of actions in each axis and try to match
> that against templates.  Depending on the algorithm it can be the kind
> of thing that can be done on a 16MHz MPU.

It's not that simple, I guess. See the "gesture" 'from_AT-EAR_to_LOOK-AT'
(recent topic). It will be distributed completely differently across the
3 axes, depending on the user being upright, lying in a chair or lying in
bed (and there's right-hand/left-hand too ;-)
Or a rotating shake with the long side as axis: it looks completely
different when the device is upright, at 45°, or flat (in fact this
gesture isn't detectable at all in the upright position in the first
place).
The only correct way is to calculate the real heading and velocity vector
of the device, as accurately as possible. Then accumulate a "route", and
this route you may compare against a set of templates for the best match
(after transforming for size and orientation). All this is very low
accuracy, so 16-bit integers and sine tables will do easily, but I don't
think it will become much cheaper than this.
That is, if the gestures are more complex than just "one sharp shake",
"2 gentle shakes", etc.
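As a rough illustration of the "16-bit integers and sine tables" point, a
quarter-wave sine lookup table scaled to the 16-bit signed range might look
like this (table size and scaling are my own choices, not from the thread;
a Python prototype of what would be a const array in integer C):

```python
# Sketch: quarter-wave sine lookup table, values scaled to 15 bits
# (max 32767), so trigonometry reduces to an integer array index.
import math

TABLE_SIZE = 256  # angles 0..90 degrees mapped onto 256 entries
SIN_TABLE = [round(math.sin(i * (math.pi / 2) / (TABLE_SIZE - 1)) * 32767)
             for i in range(TABLE_SIZE)]

def isin(index: int) -> int:
    """sin() for index in [0, 255] (0..90 deg) as a 16-bit integer."""
    return SIN_TABLE[index]
```

Symmetry gives the other three quadrants, and cosine is just a phase-shifted
lookup, so one small table covers all the trigonometry needed for normalizing
gesture orientation.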

jOERG





Re: GSoC 2008

2008-03-25 Thread Andy Green

Somebody in the thread at some point said:
> On Tue, 25 Mar 2008, Andy Green wrote:
> [...]
>> It means that a perfect solution is predicated around
>>
>>  - 16 bit integer arithmetic -- ha "no float" is too easy
>>  - per-sample short processing rather than large batching
>>  - multiply is expensive (but hey shifting is cheap :-) )
> Divide?
> Hey, for normalizing the vector direction of gestures to different
> orientations of the Neo, we will need trigonometric calculations, no?
> Anyway, it shouldn't be much harder than the trainable OCR of the Palm
> (TealScript?), and I think we can compare the power of the Palm CPU to
> that of the MPU.
>
> If everything else fails, the MPU has to buffer the G-meter data, and
> recognition of the actual gesture has to be done on the main CPU. (no
> real option)

Well it depends entirely on the algorithm and how smart we are.  If we
can bend the algorithm to use right shift instead of divide, or small
precomputed tables for the key actions, we can do it on a pretty weak
device.

Ultimately the "gesture recognition" action is about eating 200 3-byte
packets of data a second and issuing only one or two bytes per second
about any "gesture" that was seen.  I guess one way or another we use
ringbuffers to keep averages of actions in each axis and try to match
that against templates.  Depending on the algorithm it can be the kind
of thing that can be done on a 16MHz MPU.
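One way to sketch the ringbuffer-of-averages idea in MPU-friendly arithmetic
is below (a Python prototype; a real port would be integer C on the MPU). The
power-of-two buffer size is my own choice, made so the division becomes the
cheap right shift mentioned above:

```python
# Sketch: a power-of-two ring buffer keeps a running sum of recent
# samples for one axis, so the mean is a right shift, not a divide,
# and each new sample costs constant time -- MPU-friendly arithmetic.
class RingAverage:
    SHIFT = 4           # buffer holds 2**4 = 16 samples
    SIZE = 1 << SHIFT

    def __init__(self) -> None:
        self.buf = [0] * self.SIZE
        self.pos = 0
        self.total = 0

    def push(self, sample: int) -> int:
        self.total += sample - self.buf[self.pos]  # replace oldest sample
        self.buf[self.pos] = sample
        self.pos = (self.pos + 1) & (self.SIZE - 1)
        return self.total >> self.SHIFT            # mean via right shift
```

One such buffer per axis, plus a template comparison on the smoothed values,
is the kind of per-sample short processing a 16 MHz MPU could keep up with.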

-Andy


Re: GSoC 2008

2008-03-25 Thread joerg
On Tue, 25 Mar 2008, Andy Green wrote:
[...]
> It means that a perfect solution is predicated around
> 
>  - 16 bit integer arithmetic -- ha "no float" is too easy
>  - per-sample short processing rather than large batching
>  - multiply is expensive (but hey shifting is cheap :-) )
Divide?
Hey, for normalizing the vector direction of gestures to different
orientations of the Neo, we will need trigonometric calculations, no?
Anyway, it shouldn't be much harder than the trainable OCR of the Palm
(TealScript?), and I think we can compare the power of the Palm CPU to
that of the MPU.

If everything else fails, the MPU has to buffer the G-meter data, and
recognition of the actual gesture has to be done on the main CPU. (no
real option)

jOERG




Re: GSoC 2008

2008-03-25 Thread Andy Green

Somebody in the thread at some point said:
> Niluge KiWi wrote:
> 
>> With the two accelerometers in the FreeRunner, I think we can recognise
>> lots of gestures, not only simple ones like a "click" (which is already
>> recognised by the accelerometers used in the FreeRunner). The main
>> difficulty is probably to extract the useful data from the gesture
>> noise: calibration may take time. The goal is to have an almost
>> pre-calibrated library (an idea from the wish-list in the Wiki is to
>> allow the user to record their own gestures, but I think it's not easy
>> to make it simple for the end-user).
> Good idea, but consider storing the calibration data separately. This
> will make the library more general; you may want to reuse it on other
> devices. The same goes for the "recorded" gestures.

There's another constraint on this: in the future a very weak 16-bit MPU
may have the duty of interpreting the sensor data into gestures.
Whatever we do for the FreeRunner, if we get it right, it would be
perfect to port it to this proposed future "sensor management" MPU.

It means that a perfect solution is predicated around

 - 16 bit integer arithmetic -- ha "no float" is too easy
 - per-sample short processing rather than large batching
 - multiply is expensive (but hey shifting is cheap :-) )

The raw accelerometer data is predicated around byte data for X Y Z per
sample per motion sensor.

-Andy


Re: GSoC 2008

2008-03-25 Thread ewanm89
On Mon, 24 Mar 2008 19:02:34 +0100
Niluge KiWi <[EMAIL PROTECTED]> wrote:

> Hi,
> 
> I'm a student in the french engineer school ENSIMAG, and I would like
> to work for OpenMoko during the Google Summer of Code.
> 
> I'm interested in the accelerometer features [1]: Recognising
> gestures is a really important part of the interface between the user
> and the phone.
> With the two accelerometers in the FreeRunner, I think we can
> recognise lots of gestures, not only simple ones like a
> "click" (which is already recognised by the accelerometers used in
> the FreeRunner). The main difficulty is probably to extract the
> useful data from the gesture noise: calibration may take time. The
> goal is to have an almost pre-calibrated library (an idea from the
> wish-list in the Wiki is to allow the user to record their own
> gestures, but I think it's not easy to make it simple for the end-user).
> 
> The accelerometers could provide not only small gestures recognition
> (like the ones listed on the Wiki: up-side-down, shaking,
> flipping, ...), but full 3D-space positioning from a start position
> (when the software is started).
> 
> Then we can imagine lots of uses of the library: improvements in the
> control of the phone, and programs specially created to use such
> control (little games, for example).
> 
> The accelerometer gestures could be combined with the touchscreen
> for better use.
> For example, the gesture navigation could be activated only when
> pressing the screen:
> if we are viewing a large picture, zoomed in, we could move through it
> by moving the phone, but we don't want it to move all the time.
> 
> Other examples given on the Wiki [2] could be implemented by using the
> library.
> 
> 
> I looked at the driver for the accelerometers, and it seems it's not
> yet working. I don't think I'm able to work on the driver, so I hope
> it will work this summer.
> 

Who would need multitouch when we have this? Sounds great to me.

-- 
Ewan Marshall (ewanm89/Cap_J_L_Picard on irc)

http://ewanm89.co.uk/
Geek by nature, Linux by choice.




Re: GSoC 2008

2008-03-24 Thread Evgeny Ginzburg

Niluge KiWi wrote:


With the two accelerometers in the FreeRunner, I think we can recognise
lots of gestures, not only simple ones like a "click" (which is already
recognised by the accelerometers used in the FreeRunner). The main
difficulty is probably to extract the useful data from the gesture
noise: calibration may take time. The goal is to have an almost
pre-calibrated library (an idea from the wish-list in the Wiki is to
allow the user to record their own gestures, but I think it's not easy
to make it simple for the end-user).

Good idea, but consider storing the calibration data separately. This will 
make the library more general; you may want to reuse it on other devices.
The same goes for the "recorded" gestures.


The accelerometers could provide not only small gestures recognition
(like the ones listed on the Wiki: up-side-down, shaking,
flipping, ...), but full 3D-space positioning from a start position
(when the software is started).


So long, and thanks for all the fish.
Evgeny.



GSoC 2008

2008-03-24 Thread Niluge KiWi
Hi,

I'm a student at the French engineering school ENSIMAG, and I would like
to work on OpenMoko during the Google Summer of Code.

I'm interested in the accelerometer features [1]: Recognising gestures
is a really important part of the interface between the user and the
phone.
With the two accelerometers in the FreeRunner, I think we can recognise
lots of gestures, not only simple ones like a "click" (which is already
recognised by the accelerometers used in the FreeRunner). The main
difficulty is probably to extract the useful data from the gesture
noise: calibration may take time. The goal is to have an almost
pre-calibrated library (an idea from the wish-list in the Wiki is to
allow the user to record their own gestures, but I think it's not easy
to make it simple for the end-user).

The accelerometers could provide not only small-gesture recognition
(like the gestures listed on the Wiki: upside-down, shaking,
flipping, ...), but full 3D-space positioning from a start position
(when the software is started).

Then we can imagine lots of uses of the library: improvements in the
control of the phone, and programs specially created to use such
control (little games, for example).

The accelerometer gestures could be combined with the touchscreen for
better use.
For example, the gesture navigation could be activated only when pressing
the screen:
if we are viewing a large picture, zoomed in, we could move through it
by moving the phone, but we don't want it to move all the time.

Other examples given on the Wiki [2] could be implemented by using the
library.


I looked at the driver for the accelerometers, and it seems it's not yet
working. I don't think I'm able to work on the driver, so I hope it will
work by this summer.


I'm also interested in working on ambient noise detection as a second
choice.


-
[1]
http://wiki.openmoko.org/wiki/Summer_of_Code_2008#Accelerometer_Gestures
[2] http://wiki.openmoko.org/wiki/Wish_List#Accelerometer_wishes
-

I hope I'll be part of the OpenMoko project this summer,
Thomas Riccardi




Re: GSoC 2008

2008-03-23 Thread Sean Moss-Pultz

ewanm89 wrote:

On Sun, 23 Mar 2008 17:39:29 -0300
Stefan Schmidt <[EMAIL PROTECTED]> wrote:


What exactly do you mean here? If Freerunner will be available at the
time, or if Openmoko provide students the hardware?


Well obviously the hardware costs money that I don't expect to just
come from nowhere, I was more worried about actually making sure it got
to those of us who are accepted in time.


Don't worry. GSoC is extremely important to us. We will make sure all 
the people who participate have FreeRunners.


Even if they are only engineering samples, they will still work fine for 
development.


  Sean



Re: GSoC 2008

2008-03-23 Thread ewanm89
On Sun, 23 Mar 2008 17:39:29 -0300
Stefan Schmidt <[EMAIL PROTECTED]> wrote:

> What exactly do you mean here? If Freerunner will be available at the
> time, or if Openmoko provide students the hardware?

Well, obviously the hardware costs money that I don't expect to just
come from nowhere; I was more worried about actually making sure it gets
to those of us who are accepted in time.

-- 
Ewan Marshall (ewanm89/Cap_J_L_Picard on irc)

http://ewanm89.co.uk/
Geek by nature, Linux by choice.




Re: GSoC 2008

2008-03-23 Thread Stefan Schmidt
Hello.

On Sun, 2008-03-23 at 06:24, ewanm89 wrote:
> I'm heavily interested in applying to get the ad-hoc communication going
> for GSoC 2008.

Great. We had a try last year, but it failed. Nice to see it going
again.

> I just wondered what is the situation on getting the
> hardware?

What exactly do you mean here? Whether the FreeRunner will be available
in time, or whether Openmoko will provide students with the hardware?

regards
Stefan Schmidt




Re: GSoC 2008

2008-03-23 Thread Ilja O.
Seems that hardware is quite likely to become available before summer.
(This is only my opinion; I'm not related to Openmoko other than being
subscribed to the community and kernel mailing lists.)



GSoC 2008

2008-03-22 Thread ewanm89
I'm heavily interested in applying to get the ad-hoc communication going
for GSoC 2008.
I just wondered what is the situation on getting the
hardware?

-- 
Ewan Marshall (ewanm89)

Geek by nature, Linux by choice.




GSoC 2008

2008-03-19 Thread Michael 'Mickey' Lauer
Hi guys, 

as you may have already noticed, Openmoko Inc. has been accepted
as a mentoring organization for the Google Summer of Code 2008.

According to the timeline, the student application period begins Monday,
March 24, 2008, and ends Monday, March 31st. Please prepare to apply as
soon as possible, so we can use the one week interval to refine and
focus your proposal.

Please note that the list of ideas found on the wiki page is by no means
comprehensive, it's rather a bunch of things we think would be cool. If
you come up with even cooler stuff, be our guest :)

As for the number of program slots Google will actually assign us, I
have no idea yet...

Cheers,

:M:




Re: GSoC 2008: Call for ideas

2008-03-12 Thread Jens Fursund
As I am not an owner of a Neo myself, I do not know if this has already
been done.
What about making it possible to listen to music and, when a call comes in
(and you answer it), have the music pause? I don't know how easy it is to
implement, but I think it would be quite neat!
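Pausing on an incoming call and resuming afterwards is mostly a small piece
of state handling. A minimal sketch, not tied to any real OpenMoko API (all
class and method names below are made up for illustration; on a real phone
the two on_call_* methods would be hooked up to telephony events):

```python
class MediaPlayer:
    """Toy player that pauses on an incoming call and resumes afterwards."""

    def __init__(self):
        self.playing = False
        self._resume_after_call = False

    def play(self):
        self.playing = True

    def pause(self):
        self.playing = False

    def on_call_started(self):
        # Remember whether we were playing, so we only resume in that case.
        if self.playing:
            self._resume_after_call = True
            self.pause()

    def on_call_ended(self):
        if self._resume_after_call:
            self._resume_after_call = False
            self.play()


player = MediaPlayer()
player.play()
player.on_call_started()   # music pauses for the call
player.on_call_ended()     # music resumes afterwards
```

The only subtlety is the `_resume_after_call` flag: a call that arrives while
the player is already paused should not start playback when it ends.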

Best Regards,

Jens



Re: GSoC 2008: Call for ideas

2008-03-11 Thread Marcus Bauer
already there, thanks to openembedded:

http://buildhost.openmoko.org/daily/neo1973/deploy/glibc/ipk/armv4t/sqlite3_3.5.6-r0_armv4t.ipk

On Tue, 2008-03-11 at 22:52 +0200, Ilja O. wrote:
> Sqlite port would be nice thing to have. (With C/C++/Python bindings, of 
> course)


Re: GSoC 2008: Call for ideas

2008-03-11 Thread Ilja O.
An SQLite port would be a nice thing to have. (With C/C++/Python bindings, of course.)



Re: GSoC 2008: Call for ideas

2008-03-11 Thread Michele Renda
I was thinking of a port of AirStrike (http://icculus.org/airstrike/), but
it would first need an SDL_image port.

I am waiting for the FreeRunner :)




Re: GSoC 2008: Call for ideas

2008-03-10 Thread Ilja O.
Hello.
In my opinion, there are some highly useful project proposals on the wish
list that could be done by a student (like me, heh-heh-heh...) during the
summer.
First things first: the platform should provide more than one GUI binding
solution.
In my opinion, the porting priority for binding frameworks is (decreasing):
1) qt
2) wx
3) sdl
(Since Maemo/Cocoa bindings are not widely used, I think those belong in
the would-be-nice-to-have[when-we'll-have-free-time] class.)
In my opinion, GUI bindings are great projects for GSoC, since they are
useful and relatively easy to create (at least it looks that way, since
Openmoko uses a standard GTK base that these frameworks can already target
on the PC).

Also, C++ bindings are a must-have (though I have no clue how much work it
would be to implement them).
A Python binding is a good-to-have thing, since Python is great for
prototyping (but I don't think it would be a great idea to write real
applications in it on Openmoko, due to embedded platform limitations).

Also, the platform *must* include high-level bindings for standard phone
functionality: sending an SMS to a given number (with given text, of
course), making a call, cancelling a call, getting GPS coordinates,
sending a visit card over Bluetooth, BT device pairing, connecting to a BT
device... it's quite easy to build such a list, but someone has to
implement it all. (IMHO, all these API functions should be accessible via
D-Bus.) As my device-driver programming experience shows, it is quite
possible to accomplish such a thing during the summer (not all at once, of
course; someone should be playing with BT, someone with the GSM module...
But I assume you are much more aware of all these organizational issues
than me.)
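To make the idea concrete, here is a rough sketch of what such a high-level
phone API could look like from an application's point of view. The interface
and method names are hypothetical (no such Openmoko D-Bus API existed at the
time), and a fake bus object stands in for a real D-Bus connection so the
sketch is self-contained:

```python
class FakeBus:
    """Stand-in for a D-Bus connection, so the sketch runs anywhere.

    A real implementation would make a method call on the session or
    system bus instead of just recording the request.
    """

    def __init__(self):
        self.calls = []

    def call(self, interface, method, *args):
        self.calls.append((interface, method, args))
        return "ok"


class PhoneAPI:
    """Thin convenience wrapper over the (hypothetical) phone interface."""

    IFACE = "org.openmoko.Phone"   # hypothetical interface name

    def __init__(self, bus):
        self.bus = bus

    def send_sms(self, number, text):
        return self.bus.call(self.IFACE, "SendSMS", number, text)

    def dial(self, number):
        return self.bus.call(self.IFACE, "Dial", number)

    def hang_up(self):
        return self.bus.call(self.IFACE, "HangUp")


bus = FakeBus()
phone = PhoneAPI(bus)
phone.send_sms("+49123456", "Hello from GSoC")
phone.dial("+49123456")
```

The point of the wrapper layer is exactly what the mail asks for:
applications call `send_sms(number, text)` and never touch bus plumbing.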

Btw, "please, please, release freerunner! In March! Can't hold... much
longer"



Re: GSoC 2008: Call for ideas

2008-03-09 Thread Lally Singh
Cool, can we use it directly, or does it need a port?

On Sun, Mar 9, 2008 at 4:03 PM, Joachim Steiger <[EMAIL PROTECTED]> wrote:
> Lally Singh wrote:
>  > For middleware, it looks like all that's really needed is an OM
>  > version of Cocoa's NSNotificationCenter.  IMHO I think it's a great
>  > place to start -- just a filtered event channel with a C-callable API
>  > for publishing/listening for events.  I'd prefer a design that's
>  > simple & reliable for small data packets over one that's overdesigned
>  > for the 1% of the time when you want to transfer bulk data.
>
>  thats there already. its called d-bus and used by lots of gtk/gnome/glib
>  apps already
>
>  please take a look at
>
>  http://www.freedesktop.org/wiki/Software/dbus
>
>  thanks for your suggestion anyways
>
>
>  kind regards
>
>  --
>
>  Joachim Steiger
>  developer relations/support
>



-- 
H. Lally Singh
Ph.D. Candidate, Computer Science
Virginia Tech



Re: GSoC 2008: Call for ideas

2008-03-09 Thread Joachim Steiger
Lally Singh wrote:
> For middleware, it looks like all that's really needed is an OM
> version of Cocoa's NSNotificationCenter.  IMHO I think it's a great
> place to start -- just a filtered event channel with a C-callable API
> for publishing/listening for events.  I'd prefer a design that's
> simple & reliable for small data packets over one that's overdesigned
> for the 1% of the time when you want to transfer bulk data.

That's there already: it's called D-Bus, and it is used by lots of
GTK/GNOME/GLib apps already.

Please take a look at

http://www.freedesktop.org/wiki/Software/dbus

Thanks for your suggestion anyway.


kind regards

-- 

Joachim Steiger
developer relations/support



Re: GSoC 2008: Call for ideas

2008-03-09 Thread Lally Singh
For middleware, it looks like all that's really needed is an OM
version of Cocoa's NSNotificationCenter.  IMHO I think it's a great
place to start -- just a filtered event channel with a C-callable API
for publishing/listening for events.  I'd prefer a design that's
simple & reliable for small data packets over one that's overdesigned
for the 1% of the time when you want to transfer bulk data.

If I may suggest a starting point, something readable like XML would
be preferable over some binary representation of data.  That way,
later it could be extended to interact with web services. Also a *lot*
easier to debug, and plenty of small/simple and large/complex
libraries available to parse it.

With that said, the application API should have two layers: an
XML-based I/O layer that does all the low-level work, and a
string-based convenience library that does most jobs an app would
want, without making them muck around with the XML format unless they
have to get fancy.  It'll also make it easier to standardize the event
formats, as most apps would just use the facilities of the
higher-level API.  Also, remember to standardize common event names
with #defines in some header files.  It'd be nice to avoid
spelling-related bugs.

The highest-level API would be something as simple as this:
event_listen ( POWER_MANAGER, POWER_LOW, &handle_power_low );

event_listen would call two APIs:
Document *req = build_standard_request (POWER_MANAGER, POWER_LOW);
listen_event ( req, &handle_power_low )

(Where Document is just a simple DOM tree) If someone wanted to put
together a more complex event than build_standard_request does, they
could build it themselves and send it over.  Or, event_listen could
just be equivalent to these APIs, but really just do an sprintf() and
send the result over the wire itself.  As long as both options are
available to the developer, I'm happy. The interface is what matters
to me -- the implementation is a lot easier to change once apps are
built atop of it.

It looks like another convenience API to emit a signal when an event
occurs would also be useful.   The simplest method for doing so would
be to pass a signal-emitting function into event_listen.

As for a test app, a pair of programs that talk through the middleware
would be useful.  Especially if they used different message types and
handled them differently, so that the client-side API is given a fair
shake.  Also, one as a daemon, and one as a graphical app.  Should get
the APIs pretty well vetted.


On Sun, Mar 9, 2008 at 7:00 AM, Michael 'Mickey' Lauer
<[EMAIL PROTECTED]> wrote:
> Hi folks,
>
>  OpenMoko Inc. will apply for Google SoC 2008.
>
>  For a successfull application, we need to have a comprehensive list of ideas.
>
>  A first scratch at that is @
>  http://wiki.openmoko.org/wiki/Summer_of_Code_2008 -- please help us adding to
>  that.
>
>  Some rough guidelines:
>
>  * Please add only ideas that can be realized within the time constraints of
>  google SoC.
>
>  * Please add only ideas that end up in a software deliverable, e.g. an
>  application or a library.
>
>  * As OM lacks in middleware, please prefer middleware projects before
>  application level projects. However, as every part of middleware should be
>  application driven, the middleware deliverables should always include a demo
>  application.
>
>  I want to compile the mentoring application on Tuesday, 11th -- please make
>  sure all your great ideas are in by then.
>
>  Cheers,
>
>  Mickey.
>



-- 
H. Lally Singh
Ph.D. Candidate, Computer Science
Virginia Tech



GSoC 2008: Call for ideas

2008-03-09 Thread Michael 'Mickey' Lauer
Hi folks,

OpenMoko Inc. will apply for Google SoC 2008.

For a successful application, we need to have a comprehensive list of ideas.

A first scratch at that is at
http://wiki.openmoko.org/wiki/Summer_of_Code_2008 -- please help us add to
that.

Some rough guidelines:

* Please add only ideas that can be realized within the time constraints of
Google SoC.

* Please add only ideas that end up in a software deliverable, e.g. an 
application or a library.

* As OM lacks middleware, please prefer middleware projects over
application-level projects. However, as every part of the middleware should
be application driven, middleware deliverables should always include a demo
application.

I want to compile the mentoring application on Tuesday, 11th -- please make 
sure all your great ideas are in by then.

Cheers,

Mickey.
