[web2py] Re: Watching a table for changes

2012-06-12 Thread Anthony

>
> Since I do not want to append the function before each insert, I simply 
> added it right after creating the table in the model, and now it works 
> well since it is executed together with the rest of the model code. 
>
> Are there good reasons not to do it?
>
> Since I use web2py mainly as a back-end to JSON-RPC calls and gateway to 
> the database,
> my application is basically built out of a huge default controller and a 
> huge model file. I don't
> think I would know where to put it otherwise.
>

Sure, putting it in the model file where you create the table makes sense.
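
For example, the registration could look something like this in the model 
file (the table, field, and callback names here are only placeholders):

db.define_table('comments', Field('body', 'text'))

def comments_changed(fields, id):
    # 'fields' is the dict of inserted values, 'id' is the new record id
    print 'comments insert:', id, fields

db.comments._after_insert.append(comments_changed)

Because the model file runs at the start of every request, the callback is 
already registered before any insert a controller does in that request.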

Anthony 


[web2py] Re: Watching a table for changes

2012-06-12 Thread Omri Har-Shemesh
Thanks for all your help!

Now I understand web2py a little better :-)
Since I do not want to append the function before each insert, I simply 
added it right after creating the table in the model, and now it works 
well since it is executed together with the rest of the model code. 

Are there good reasons not to do it?

Since I use web2py mainly as a back-end to JSON-RPC calls and gateway to 
the database,
my application is basically built out of a huge default controller and a 
huge model file. I don't
think I would know where to put it otherwise.

Best wishes,
Omri

On Monday, June 11, 2012 6:25:15 PM UTC+2, Anthony wrote:
>
> Model (i.e., DAL Table) definitions do not survive across requests -- 
> during a request, you define the table, and then you use the table object 
> to access the db, etc. In your workflow, you are adding the _after_insert 
> function in one request (i.e., the RPC request) but doing the insert in a 
> separate request. This is not how web2py works -- you have to add the 
> _after_insert function in the same request as the insert.
>
> More generally, keep in mind that your application code is executed on 
> each request -- whatever you define in a given request is specific to that 
> request and does not persist to subsequent requests (unless, of course, you 
> explicitly store it in the cache, session, db, filesystem, etc.).
>
> Anthony
>
> On Monday, June 11, 2012 11:56:36 AM UTC-4, Omri Har-Shemesh wrote:
>>
>> In order to attach the function, I have a function which is being called 
>> through JSON-RPC, and looks like this:
>>
>> @service.jsonrpc
>> def append_listeners():
>>     print db.comments._after_insert  # prints []
>>     db.comments._after_insert.append(MyFunction)
>>     print db.comments._after_insert  # prints [<function MyFunction at 0xabfb8df4>]
>>
>> No matter how many times I call this function, the print statements stay 
>> the same (except for the address of the function MyFunction).
>> I modified the insert function in dal.py for debugging:
>> def insert(self, **fields):
>>     print "_after_insert array: ", self._after_insert  # prints _after_insert array:  []
>>     print "table name: ", self._tablename  # prints table name: comments
>>     self._attempt_upload(fields)
>>     if any(f(fields) for f in self._before_insert): return 0
>>     ret = self._db._adapter.insert(self, self._listify(fields))
>>     ret and [f(fields, ret) for f in self._after_insert]
>>     return ret
>>
>> and indeed, when I insert something to the comments table, I get two 
>> lines printed out as shown in the comments text.
>> It seems that the additional function is not saved in the table object.
>>
>> Am I doing something fundamentally wrong? Should I add the function from 
>> the model file? Or at any other place?
>>
>> Thanks,
>> Omri
>>
>>
>>
>> On Monday, June 11, 2012 5:33:53 PM UTC+2, Anthony wrote:
>>>
>>> I tried adding a method using:
>>>> def SomeMethod(fields, res):
>>>>     print "inside SomeMethod"
>>>>
>>>> db.my_table._after_insert.append(SomeMethod)
>>>>
>>>> but nothing seems to happen. Is it possible that the implementation of 
>>>> the adapters does not call the methods inside _after_insert, 
>>>> _after_update and _after_delete?

>>>
>>> I would think it should work. Have you also defined any _before_insert 
>>> functions? If they return anything, the insert will be aborted. Also, are 
>>> you sure the insert is happening -- if not, the _after_insert functions 
>>> won't run. You may need to show more code.
>>>
>>> Anthony 
>>>
>>

[web2py] Re: Watching a table for changes

2012-06-11 Thread Anthony
Model (i.e., DAL Table) definitions do not survive across requests -- 
during a request, you define the table, and then you use the table object 
to access the db, etc. In your workflow, you are adding the _after_insert 
function in one request (i.e., the RPC request) but doing the insert in a 
separate request. This is not how web2py works -- you have to add the 
_after_insert function in the same request as the insert.

More generally, keep in mind that your application code is executed on each 
request -- whatever you define in a given request is specific to that 
request and does not persist to subsequent requests (unless, of course, you 
explicitly store it in the cache, session, db, filesystem, etc.).
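
To make that concrete, here is a rough sketch of the two requests involved 
(it assumes db, service, and MyFunction are defined in the model, and the 
action names and the body field are only illustrative):

@service.jsonrpc
def append_listeners():
    # this append only mutates the Table object built for THIS request
    db.comments._after_insert.append(MyFunction)
    return 'appended'

@service.jsonrpc
def add_comment(body):
    # a later request re-executes the model and rebuilds db.comments, so
    # _after_insert is back to its default (empty) list here and
    # MyFunction is never called
    return db.comments.insert(body=body)

If the append is done in the model file instead, it runs again at the start 
of every request, so it is always in place by the time the insert happens.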

Anthony

On Monday, June 11, 2012 11:56:36 AM UTC-4, Omri Har-Shemesh wrote:
>
> In order to attach the function, I have a function which is being called 
> through JSON-RPC, and looks like this:
>
> @service.jsonrpc
> def append_listeners():
>     print db.comments._after_insert  # prints []
>     db.comments._after_insert.append(MyFunction)
>     print db.comments._after_insert  # prints [<function MyFunction at 0xabfb8df4>]
>
> No matter how many times I call this function, the print statements stay 
> the same (except for the address of the function MyFunction).
> I modified the insert function in dal.py for debugging:
> def insert(self, **fields):
>     print "_after_insert array: ", self._after_insert  # prints _after_insert array:  []
>     print "table name: ", self._tablename  # prints table name: comments
>     self._attempt_upload(fields)
>     if any(f(fields) for f in self._before_insert): return 0
>     ret = self._db._adapter.insert(self, self._listify(fields))
>     ret and [f(fields, ret) for f in self._after_insert]
>     return ret
>
> and indeed, when I insert something to the comments table, I get two lines 
> printed out as shown in the comments text.
> It seems that the additional function is not saved in the table object.
>
> Am I doing something fundamentally wrong? Should I add the function from 
> the model file? Or at any other place?
>
> Thanks,
> Omri
>
>
>
> On Monday, June 11, 2012 5:33:53 PM UTC+2, Anthony wrote:
>>
>> I tried adding a method using:
>>> def SomeMethod(fields, res):
>>>     print "inside SomeMethod"
>>>
>>> db.my_table._after_insert.append(SomeMethod)
>>>
>>> but nothing seems to happen. Is it possible that the implementation of 
>>> the adapters
>>> does not call the methods inside _after_insert, _after_update and 
>>> _after_delete?
>>>
>>
>> I would think it should work. Have you also defined any _before_insert 
>> functions? If they return anything, the insert will be aborted. Also, are 
>> you sure the insert is happening -- if not, the _after_insert functions 
>> won't run. You may need to show more code.
>>
>> Anthony 
>>
>

[web2py] Re: Watching a table for changes

2012-06-11 Thread Omri Har-Shemesh
In order to attach the function, I have a function which is being called 
through JSON-RPC, and looks like this:

@service.jsonrpc
def append_listeners():
    print db.comments._after_insert  # prints []
    db.comments._after_insert.append(MyFunction)
    print db.comments._after_insert  # prints [<function MyFunction at 0xabfb8df4>]

No matter how many times I call this function, the print statements stay 
the same (except for the address of the function MyFunction).
I modified the insert function in dal.py for debugging:
def insert(self, **fields):
    print "_after_insert array: ", self._after_insert  # prints _after_insert array:  []
    print "table name: ", self._tablename  # prints table name: comments
    self._attempt_upload(fields)
    if any(f(fields) for f in self._before_insert): return 0
    ret = self._db._adapter.insert(self, self._listify(fields))
    ret and [f(fields, ret) for f in self._after_insert]
    return ret

and indeed, when I insert something to the comments table, I get two lines 
printed out as shown in the comments text.
It seems that the additional function is not saved in the table object.

Am I doing something fundamentally wrong? Should I add the function from 
the model file? Or at any other place?

Thanks,
Omri



On Monday, June 11, 2012 5:33:53 PM UTC+2, Anthony wrote:
>
> I tried adding a method using:
>> def SomeMethod(fields, res):
>>     print "inside SomeMethod"
>>
>> db.my_table._after_insert.append(SomeMethod)
>>
>> but nothing seems to happen. Is it possible that the implementation of 
>> the adapters
>> does not call the methods inside _after_insert, _after_update and 
>> _after_delete?
>>
>
> I would think it should work. Have you also defined any _before_insert 
> functions? If they return anything, the insert will be aborted. Also, are 
> you sure the insert is happening -- if not, the _after_insert functions 
> won't run. You may need to show more code.
>
> Anthony 
>


[web2py] Re: Watching a table for changes

2012-06-11 Thread Anthony

>
> I tried adding a method using:
> def SomeMethod(fields, res):
>     print "inside SomeMethod"
>
> db.my_table._after_insert.append(SomeMethod)
>
> but nothing seems to happen. Is it possible that the implementation of the 
> adapters
> does not call the methods inside _after_insert, _after_update and 
> _after_delete?
>

I would think it should work. Have you also defined any _before_insert 
functions? If they return anything, the insert will be aborted. Also, are 
you sure the insert is happening -- if not, the _after_insert functions 
won't run. You may need to show more code.

Anthony 


[web2py] Re: Watching a table for changes

2012-06-11 Thread Anthony
Here's the full code for insert:

def insert(self, **fields):
    self._attempt_upload(fields)
    if any(f(fields) for f in self._before_insert): return 0
    ret = self._db._adapter.insert(self, self._listify(fields))
    ret and [f(fields, ret) for f in self._after_insert]
    return ret

ret is the id of the inserted record (or 0 if the insert fails). ret is 
passed to the _after_insert functions in case those functions need to 
reference the inserted record for any reason. It does not appear the 
functions need to return anything. The reason for ret and [f(fields, ret) 
for f in self._after_insert] is so the _after_insert functions only get 
executed if the insert was successful. If the insert failed, ret will be 0, 
which evaluates to False, which prevents the part after the "and" from 
being evaluated (i.e., the functions won't be called).

Note, if any of the _before_insert functions return anything that evaluates 
to True (i.e., any values other than False, None, 0, or an empty string, 
list, dict, etc.), then the insert will be abandoned (i.e., 0 is returned 
immediately). So, the _before_insert functions can be used to prevent an 
insert if desired.
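
For example, something along these lines (the table, field, and callback 
names are just illustrative, not from your app):

def block_empty(fields):
    # a truthy return value from a _before_insert callback aborts the insert
    return not fields.get('body')

def log_insert(fields, id):
    # only runs if the insert succeeded; its return value is ignored
    print 'inserted record', id

db.comments._before_insert.append(block_empty)
db.comments._after_insert.append(log_insert)

db.comments.insert(body='')       # blocked, returns 0, log_insert not called
db.comments.insert(body='hello')  # returns the new id, log_insert is called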

Anthony

On Monday, June 11, 2012 9:06:35 AM UTC-4, Omri Har-Shemesh wrote:
>
> Hi Anthony,
>
> thanks for the reply! 
> I have been looking at the code and there is something I don't really 
> understand (sorry for a noob question):
> when the functions get called, they are called using a pattern:
>
> ret = ...
> ret and [f(fields,ret) for f in self._after_insert]
> return ret
>
> The question is - should the function I define return something?
> Is the result of the function call used somehow, somewhere?
>
> I suspect that using ret and [] is simply in order to invoke the functions,
> but then I don't understand why not use something like
> [f(fields,ret) for f in self._after_insert]
> without using the and operator.
>
> Best wishes,
> Omri
>
> On Wednesday, June 6, 2012 3:44:46 PM UTC+2, Anthony wrote:
>>
>> In trunk, there are now triggers for DAL tables:
>>
>> self._before_insert = []
>> self._before_update = [lambda self,fs:self.delete_uploaded_files(fs)]
>> self._before_delete = [lambda self:self.delete_uploaded_files()]
>> self._after_insert = []
>> self._after_update = []
>> self._after_delete = []
>>
>> You could define a function that stores a timestamp in the cache and add 
>> it to the three "_after_" triggers for the relevant tables. Then just check 
>> the timestamp to see if it is more recent than the last request. To 
>> determine the time of the last request, you could store a timestamp in the 
>> session upon each request (so it will be specific to each user).
>>
>> Anthony
>>
>> On Wednesday, June 6, 2012 5:30:56 AM UTC-4, Omri Har-Shemesh wrote:
>>>
>>> Is it possible to watch a table for changes and define a callback 
>>> whenever the table is changed?
>>> I am trying to implement a mechanism where the client asks the server 
>>> for data every 5 seconds, and the server should only return a non-null 
>>> value when the data has changed. The problem is that
>>> the data is quite complex and is inferred from several tables. 
>>> Therefore, in order to save processing time,
>>> I want to calculate this data and return it only when I know it has 
>>> changed.
>>>
>>> Thanks,
>>> Omri
>>>
>>

[web2py] Re: Watching a table for changes

2012-06-11 Thread Omri Har-Shemesh
Hi Again,

I tried adding a method using:
def SomeMethod(fields, res):
    print "inside SomeMethod"

db.my_table._after_insert.append(SomeMethod)

but nothing seems to happen. Is it possible that the implementation of the 
adapters
does not call the methods inside _after_insert, _after_update and 
_after_delete?

Best wishes,
Omri



On Monday, June 11, 2012 3:06:35 PM UTC+2, Omri Har-Shemesh wrote:
>
> Hi Anthony,
>
> thanks for the reply! 
> I have been looking at the code and there is something I don't really 
> understand (sorry for a noob question):
> when the functions get called, they are called using a pattern:
>
> ret = ...
> ret and [f(fields,ret) for f in self._after_insert]
> return ret
>
> The question is - should the function I define return something?
> Is the result of the function call used somehow, somewhere?
>
> I suspect that using ret and [] is simply in order to invoke the functions,
> but then I don't understand why not use something like
> [f(fields,ret) for f in self._after_insert]
> without using the and operator.
>
> Best wishes,
> Omri
>
> On Wednesday, June 6, 2012 3:44:46 PM UTC+2, Anthony wrote:
>>
>> In trunk, there are now triggers for DAL tables:
>>
>> self._before_insert = []
>> self._before_update = [lambda self,fs:self.delete_uploaded_files(fs)]
>> self._before_delete = [lambda self:self.delete_uploaded_files()]
>> self._after_insert = []
>> self._after_update = []
>> self._after_delete = []
>>
>> You could define a function that stores a timestamp in the cache and add 
>> it to the three "_after_" triggers for the relevant tables. Then just check 
>> the timestamp to see if it is more recent than the last request. To 
>> determine the time of the last request, you could store a timestamp in the 
>> session upon each request (so it will be specific to each user).
>>
>> Anthony
>>
>> On Wednesday, June 6, 2012 5:30:56 AM UTC-4, Omri Har-Shemesh wrote:
>>>
>>> Is it possible to watch a table for changes and define a callback 
>>> whenever the table is changed?
>>> I am trying to implement a mechanism where the client asks the server 
>>> for data every 5 seconds, and the server should only return a non-null 
>>> value when the data has changed. The problem is that
>>> the data is quite complex and is inferred from several tables. 
>>> Therefore, in order to save processing time,
>>> I want to calculate this data and return it only when I know it has 
>>> changed.
>>>
>>> Thanks,
>>> Omri
>>>
>>

[web2py] Re: Watching a table for changes

2012-06-11 Thread Omri Har-Shemesh
Hi Anthony,

thanks for the reply! 
I have been looking at the code and there is something I don't really 
understand (sorry for a noob question):
when the functions get called, they are called using a pattern:

ret = ...
ret and [f(fields,ret) for f in self._after_insert]
return ret

The question is - should the function I define return something?
Is the result of the function call used somehow, somewhere?

I suspect that using ret and [] is simply in order to invoke the functions,
but then I don't understand why not use something like
[f(fields,ret) for f in self._after_insert]
without using the and operator.

Best wishes,
Omri

On Wednesday, June 6, 2012 3:44:46 PM UTC+2, Anthony wrote:
>
> In trunk, there are now triggers for DAL tables:
>
> self._before_insert = []
> self._before_update = [lambda self,fs:self.delete_uploaded_files(fs)]
> self._before_delete = [lambda self:self.delete_uploaded_files()]
> self._after_insert = []
> self._after_update = []
> self._after_delete = []
>
> You could define a function that stores a timestamp in the cache and add 
> it to the three "_after_" triggers for the relevant tables. Then just check 
> the timestamp to see if it is more recent than the last request. To 
> determine the time of the last request, you could store a timestamp in the 
> session upon each request (so it will be specific to each user).
>
> Anthony
>
> On Wednesday, June 6, 2012 5:30:56 AM UTC-4, Omri Har-Shemesh wrote:
>>
>> Is it possible to watch a table for changes and define a callback 
>> whenever the table is changed?
>> I am trying to implement a mechanism where the client asks the server 
>> for data every 5 seconds, and the server should only return a non-null 
>> value when the data has changed. The problem is that
>> the data is quite complex and is inferred from several tables. Therefore, 
>> in order to save processing time,
>> I want to calculate this data and return it only when I know it has 
>> changed.
>>
>> Thanks,
>> Omri
>>
>

[web2py] Re: Watching a table for changes

2012-06-06 Thread Anthony
In trunk, there are now triggers for DAL tables:

self._before_insert = []
self._before_update = [lambda self,fs:self.delete_uploaded_files(fs)]
self._before_delete = [lambda self:self.delete_uploaded_files()]
self._after_insert = []
self._after_update = []
self._after_delete = []

You could define a function that stores a timestamp in the cache and add it 
to the three "_after_" triggers for the relevant tables. Then just check 
the timestamp to see if it is more recent than the last request. To 
determine the time of the last request, you could store a timestamp in the 
session upon each request (so it will be specific to each user).
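
Roughly, the idea could be sketched like this (an untested sketch; the 
'comments' table and build_fresh_data are only placeholders for your own 
tables and data-building code):

import datetime

def touch_comments(*args):
    # runs after any insert/update/delete on db.comments; time_expire=0
    # forces the cached value to be recomputed, i.e. overwritten with now
    cache.ram('comments_changed', lambda: datetime.datetime.now(), time_expire=0)

for trigger in (db.comments._after_insert,
                db.comments._after_update,
                db.comments._after_delete):
    trigger.append(touch_comments)

def poll():
    # controller action the client calls every 5 seconds
    last_change = cache.ram('comments_changed', lambda: None, time_expire=None)
    last_seen = session.last_seen            # time of this user's previous poll
    session.last_seen = datetime.datetime.now()
    if last_change and (last_seen is None or last_change > last_seen):
        return dict(data=build_fresh_data())  # only recompute when something changed
    return dict(data=None)

The callback takes *args because the _after_insert, _after_update, and 
_after_delete triggers pass different arguments, and it ignores them anyway.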

Anthony

On Wednesday, June 6, 2012 5:30:56 AM UTC-4, Omri Har-Shemesh wrote:
>
> Is it possible to watch a table for changes and define a callback whenever 
> the table is changed?
> I am trying to implement a mechanism where the client asks the server 
> for data every 5 seconds, and the server should only return a non-null 
> value when the data has changed. The problem is that
> the data is quite complex and is inferred from several tables. Therefore, 
> in order to save processing time,
> I want to calculate this data and return it only when I know it has 
> changed.
>
> Thanks,
> Omri
>