[google-appengine] Re: Using memcache might be free, but it's costing you money with instance pricing! Use instance caches if possible.

2011-09-28 Thread Paolo Casciello
Remember to add locking to the in-process caching mechanism, or your
instance may fail randomly once the new Python 2.7 runtime (with
concurrent requests) is rolled out.
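
Something along these lines would do (just a sketch; instance_cache_get
and the loader callback are illustrative names, not App Engine APIs):

    import threading
    import time

    _cache = {}                     # module-level dict, lives as long as the instance
    _cache_lock = threading.Lock()  # guards _cache under concurrent requests

    def instance_cache_get(key, loader, seconds_valid=3600):
        # Return a cached value, reloading it via loader() when missing or expired.
        now = time.time()
        with _cache_lock:
            entry = _cache.get(key)
            if entry is not None and entry[1] > now:
                return entry[0]
        # Load outside the lock so one slow load does not block every request.
        value = loader()
        with _cache_lock:
            _cache[key] = (value, now + seconds_valid)
        return value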





[google-appengine] Re: Using memcache might be free, but it's costing you money with instance pricing! Use instance caches if possible.

2011-09-26 Thread Peter Dev
Does the new pricing model include something like a memcache
data-transfer price?




[google-appengine] Re: Using memcache might be free, but it's costing you money with instance pricing! Use instance caches if possible.

2011-09-09 Thread Tammo Freese
Hi Santiago,


On Sep 8, 9:01 pm, Santiago Lema jacques.l...@gmail.com wrote:
 Since this app basically serves semi-static content (it's updated once
 a day) I used AppEngine's memcache to serve every request.
[...]
 So I just added another level of caching before memcache: a simple
 python dict that stores the data in the instance itself. This
 instantly reduced the number of instances back to 1 (sometimes 2).
 This suddenly makes the new pricing acceptable.

If
1) you know how long the content won't change,
2) the content is publicly accessible, and
3) you have billing enabled,
you should also set the Cache-Control header. Then Google may put your
content in its front-end cache, and for hits on that cache you are only
billed for traffic, not CPU/instance time. As far as I know, there are no
guarantees whether or when the front-end cache kicks in, so I would keep
the instance cache and Memcache.

If you are on Python:

    seconds_valid = 86400  # add whatever value is appropriate here (e.g. one day)
    self.response.headers['Cache-Control'] = 'public, max-age=%d' % seconds_valid

You can tell the front-end cache is working when your log shows 204 No
Content entries with 0 CPU seconds.
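
For illustration, the two caching layers plus the Cache-Control header
could be combined roughly like this (just a sketch; build_content() and
the 'content' key are placeholders, not code from your app):

    from google.appengine.api import memcache
    from google.appengine.ext import webapp

    _instance_cache = {}    # simple per-instance dict, as described above
    SECONDS_VALID = 86400   # the content changes about once a day

    def build_content():
        # Placeholder: the real app would read the semi-static data
        # (e.g. from the datastore) and render it here.
        return 'hello, cached world'

    class ContentHandler(webapp.RequestHandler):
        def get(self):
            content = _instance_cache.get('content')
            if content is None:
                content = memcache.get('content')
                if content is None:
                    content = build_content()
                    memcache.set('content', content, time=SECONDS_VALID)
                _instance_cache['content'] = content
            self.response.headers['Cache-Control'] = 'public, max-age=%d' % SECONDS_VALID
            self.response.out.write(content)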
Please let me know whether it worked for you.


Take care,

Tammo




[google-appengine] Re: Using memcache might be free, but it's costing you money with instance pricing! Use instance caches if possible.

2011-09-09 Thread Santiago Lema
Thanks a lot! I'll definitely try this.





[google-appengine] Re: Using memcache might be free, but it's costing you money with instance pricing! Use instance caches if possible.

2011-09-09 Thread Santiago Lema
Tammo, I thought you might like to know I quoted you in my fresh blog
post on this matter:

http://www.smallte.ch/blog-read_fr_34003.html




[google-appengine] Re: Using memcache might be free, but it's costing you money with instance pricing! Use instance caches if possible.

2011-09-09 Thread Santiago Lema
Ok, I have just tested it and it seems to work beautifully (at least
for all the requests that didn't include a random value!).




[google-appengine] Re: Using memcache might be free, but it's costing you money with instance pricing! Use instance caches if possible.

2011-09-08 Thread Brandon Thomson
Good suggestion. At least until we have workable multithreading, I have
become very reluctant to use RPCs unless absolutely necessary. It helps keep
the instance number down.




Re: [google-appengine] Re: Using memcache might be free, but it's costing you money with instance pricing! Use instance caches if possible.

2011-09-08 Thread Rishi Arora
Actually, I thought using RPCs is a good way to achieve parallelism in
Python before Python 2.7 comes along.  My app does a lot of URL fetches
from external websites, which is very inefficient in terms of GAE instance
uptime: if an external website takes 30 seconds to respond, my instance
stays up for 30 seconds and can't service any other requests.  So my
solution has been to use asynchronous URL fetches via create_rpc.  If I
start 10 URL fetches asynchronously and each takes 30 seconds to return,
then my 30 seconds of GAE instance uptime are amortized over 10 URL fetches
instead of a single one.  Does my logic sound wrong?  I have yet to
implement this, but I imagine it will lower my costs by allowing me to set
a really low value for MaxNumInstances (like 1 or 2), and by lowering my
instance hours, without sacrificing request latency.
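
Roughly what I have in mind, untested (the fetch_all name, the 30-second
deadline and the error handling are just placeholders):

    from google.appengine.api import urlfetch

    def fetch_all(urls, deadline=30):
        # Start all the fetches at once, then collect the results.
        rpcs = []
        for url in urls:
            rpc = urlfetch.create_rpc(deadline=deadline)
            urlfetch.make_fetch_call(rpc, url)
            rpcs.append(rpc)

        results = []
        for rpc in rpcs:
            try:
                result = rpc.get_result()  # blocks until this fetch finishes
                results.append((result.status_code, result.content))
            except urlfetch.DownloadError:
                results.append((None, None))  # slow or unreachable site
        return results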

On Thu, Sep 8, 2011 at 2:35 PM, Brandon Thomson gra...@gmail.com wrote:

 Good suggestion. At least until we have workable multithreading, I have
 become very reluctant to use RPCs unless absolutely necessary. It helps keep
 the instance number down.




Re: [google-appengine] Re: Using memcache might be free, but it's costing you money with instance pricing! Use instance caches if possible.

2011-09-08 Thread Greg Darke (Google)
Rishi: This sounds like an excellent idea. Even with concurrent
requests in Python 2.7, using the asynchronous APIs like this will
make your application perform much better.





[google-appengine] Re: Using memcache might be free, but it's costing you money with instance pricing! Use instance caches if possible.

2011-09-08 Thread Greg
Won't you have problems with instances that are alive around the time you
update the content?

When you send the update request, that instance will update its dict
(and memcache and the datastore). If you have another instance alive
at that time, it won't know about the update and will keep serving
the old content. If your update happens at a predictable time, you could
make the cache logic reload from the datastore, but for unpredictable
updates this technique will break if your app scales above one
instance.
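
One way around that would be to keep a small version number in memcache
and have every instance check it before trusting its local dict (just a
sketch, the key names and helpers are made up):

    from google.appengine.api import memcache

    _local = {'version': None, 'content': None}

    def get_content(load_from_datastore):
        # Serve from the instance dict only while its version matches memcache.
        version = memcache.get('content_version')  # bumped by the update request
        if version is not None and version == _local['version']:
            return _local['content']
        content = memcache.get('content')
        if content is None:
            content = load_from_datastore()
            memcache.set('content', content)
        _local['content'] = content
        _local['version'] = version
        return content

    def publish_update(new_content):
        # Called by the update request: store the new content and bump the version.
        memcache.set('content', new_content)
        memcache.incr('content_version', initial_value=0)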




[google-appengine] Re: Using memcache might be free, but it's costing you money with instance pricing! Use instance caches if possible.

2011-09-08 Thread Gerald Tan
Would be nice to have a Memcache.OnEntryUpdatedCallback...
