On Sat, September 16, 2006 9:05 pm, Debajyoti Bera wrote:
>> 8.) Build a plugin mechanism in the UI and libbeagle
> What do you mean by plugin for libbeagle? What kind of plugins are you
> looking for?
I meant to say that the plug-ins should be controllable from the GUI, CLI,
and libbeagle. I fin
> Also, I forgot to add:
>
> 7.) Define a plugin model for queryables to be developed and distributed
> outside of beagle.
It is possible to build a queryable outside of beagle; it's just not well
documented. Assuming you understand the distinction between a QueryDriver
(one which queries existing
Also, I forgot to add:
7.) Define a plugin model for queryables to be developed and distributed
outside of beagle.
8.) Build a plugin mechanism in the UI and libbeagle
Adam
___
Dashboard-hackers mailing list
Dashboard-hackers@gnome.org
http://mail.gnome
Joe,
After mulling over this issue I believe that I have finally found a way to
express my point of view. Beagle currently addresses the "personal
information space" problem. This is a big problem and I understand it
personally as I have 2TB of disk under my desk right now. With my user
hat on
Hi,
On Thu, 2006-09-14 at 13:54 -0500, Adam T. Gautier wrote:
> On Thu, September 14, 2006 1:06 pm, Joe Shaw wrote:
> > In general I am rather nervous about indexing remote resources for a
> > number of reasons:
>
> Yes, this is understandable. However, I assume that the QueryDomain was
> meant t
On Thu, September 14, 2006 1:06 pm, Joe Shaw wrote:
> In general I am rather nervous about indexing remote resources for a
> number of reasons:
Yes, this is understandable. However, I assume that the QueryDomain was
meant to address this.
>
> * Doing so can be extremely taxing on the networ
Hi,
On Thu, 2006-09-14 at 10:51 -0500, Adam T. Gautier wrote:
> After scanning the code for the MozillaQueryable it seems that the code
> deals with the mail client but there does not seem to be code for indexing
> the bookmarks.
The MozillaQueryable code has been dead for quite some time now.
ble
then submit requests (all combinations of query parts that != keyword(s))
to the URL associated with the keyword.
* Evaluate responses for correctness (HTTP 200 OK)
* Return all correct responses as hits, ranked by the number of query parts
used in the request
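The matching steps above could be sketched roughly as follows. This is only an illustration, not beagle code: `keyword_query_hits`, `url_for`, and `fetch` are hypothetical names I made up, and the `fetch` function is injected so the sketch can be exercised without a network.

```python
from itertools import combinations

def keyword_query_hits(parts, keyword, url_for, fetch):
    """Submit every combination of the non-keyword query parts to the
    URL associated with the keyword, keep the responses that come back
    HTTP 200 OK, and rank hits by how many query parts each request used."""
    others = [p for p in parts if p != keyword]
    hits = []
    # Try larger combinations first so the best-ranked hits surface early.
    for n in range(len(others), 0, -1):
        for combo in combinations(others, n):
            url = url_for(keyword, combo)
            status, body = fetch(url)
            if status == 200:  # "correct" response, per the steps above
                hits.append((n, url, body))
    # Rank by number of query parts used in the request, descending.
    hits.sort(key=lambda hit: hit[0], reverse=True)
    return hits
```

In practice `fetch` would be an HTTP GET and `url_for` would expand the keyword's URL template; here they are left as parameters precisely because that mapping is the part the mail thread has not pinned down.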
Assumptions:
* There currently is not a sp