Hi Richard,

Consider using NDB and its coroutine-based parallelism (tasklets). It's
pretty much exactly built for this situation: if you have multiple
coroutines, each one runs until it attempts to wait on an RPC (such as a
datastore or memcache operation), at which point all the pending requests
are automatically batched together and executed as efficiently as possible.
Thus, you get the benefits of get_multi without the complexity of
restructuring your code for it.
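To make the mechanism concrete, here's a small self-contained sketch of the
idea (this is plain Python, not the real NDB API — the names `lookup`,
`scheduler`, `DATA`, and `batches` are all made up for illustration; in real
NDB you'd decorate functions with @ndb.tasklet and yield things like
key.get_async()). Each coroutine yields the key it needs, and a tiny
scheduler gathers every pending key into a single batched fetch before
resuming them:

```python
DATA = {"a": 1, "b": 2, "c": 3}  # stands in for the datastore/memcache
batches = []   # records how keys were grouped into batched fetches
results = {}   # each coroutine stores its result here

def lookup(name, key):
    # A coroutine "waits on an RPC" by yielding the key it wants.
    value = yield key
    results[name] = value

def scheduler(coros):
    # Advance each coroutine to its first yield to collect pending keys.
    waiting = [(c, next(c)) for c in coros]
    while waiting:
        keys = [k for _, k in waiting]
        batches.append(keys)                     # one batched fetch per round
        fetched = {k: DATA[k] for k in keys}     # pretend this is one RPC
        next_round = []
        for coro, key in waiting:
            try:
                # Resume the coroutine with its value; it may yield again.
                next_round.append((coro, coro.send(fetched[key])))
            except StopIteration:
                pass                             # coroutine finished
        waiting = next_round

scheduler([lookup("x", "a"), lookup("y", "b"), lookup("z", "c")])
print(batches)   # all three keys fetched in a single batch
print(results)
```

The point is that the three lookups, written as independent functions, still
collapse into one batched fetch — which is what NDB's event loop does for
you behind the scenes.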

-Nick Johnson

On Wed, Nov 23, 2011 at 7:18 AM, Richard Arrano <rickarr...@gmail.com> wrote:

> Hello,
> Quick question regarding multithreading in Python 2.7:
> I have some requests that call 2-3 functions that call the memcache in
> each function. It would be possible but quite complicated to just use
> get_multi, and I was wondering if I could simply put each function
> into a thread and run the 2-3 threads to achieve some parallelism.
> Would this work, or am I mistaken about what we can and cannot do with
> regards to multithreading in 2.7?
>
> Thanks,
> Richard
>
> --
> You received this message because you are subscribed to the Google Groups
> "Google App Engine" group.
> To post to this group, send email to google-appengine@googlegroups.com.
> To unsubscribe from this group, send email to
> google-appengine+unsubscr...@googlegroups.com.
> For more options, visit this group at
> http://groups.google.com/group/google-appengine?hl=en.
>
>


-- 
Nick Johnson, Developer Programs Engineer, App Engine

