[google-appengine] Re: Just released: Python SDK 1.2.3
Great job and many thanks! These three updates are very useful for me:

- Task Queue support, available as google.appengine.api.labs.taskqueue.
- Django 1.0 support. You must install Django locally on your machine for the SDK, but you no longer need to upload it to App Engine.
- Urlfetch supports asynchronous requests.

2009/6/19 Jason (Google)
> A new release of the Python SDK was made available earlier today. In
> addition to oft-requested support for Django 1.0 and asynchronous URL
> Fetch, this release introduces the experimental Task Queue API, which
> allows you to perform offline processing on App Engine by scheduling
> bundles of work (tasks) for automatic execution in the background
> without having to worry about managing threads or polling.
>
> You can download the newest SDK directly from the Downloads page at
> http://code.google.com/appengine/downloads.html.
>
> Check out the official blog post, release notes, and documentation for
> more on the newest features of the SDK. And, as always, please feel
> free to share your questions, suggestions, and comments on the group.
>
> http://googleappengine.blogspot.com/2009/06/new-task-queue-api-on-google-app-engine.html
> http://code.google.com/p/googleappengine/wiki/SdkReleaseNotes
> http://code.google.com/appengine/docs/python/taskqueue/
>
> Cheers!
> - Jason

You received this message because you are subscribed to the Google Groups "Google App Engine" group. To post to this group, send email to google-appengine@googlegroups.com. To unsubscribe from this group, send email to google-appengine+unsubscr...@googlegroups.com. For more options, visit this group at http://groups.google.com/group/google-appengine?hl=en
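For readers new to the API: in SDK 1.2.3 a task is enqueued with taskqueue.add(...) from google.appengine.api.labs.taskqueue and is delivered later as an HTTP POST to a handler URL in your app. The following self-contained sketch imitates that enqueue-and-dispatch flow with plain-Python stand-ins; the names `enqueue`, `dispatch`, and the `/resize` URL are hypothetical, not App Engine API.

```python
# Self-contained sketch of the Task Queue pattern: enqueue a small bundle
# of work now, execute it later, with no threads or polling in user code.
# On App Engine the enqueue step is roughly:
#     from google.appengine.api.labs import taskqueue
#     taskqueue.add(url='/resize', params={'image_id': '42'})
# and the dispatch step is the service POSTing each task to your handler.

pending = []  # stand-in for the queue itself


def enqueue(url, params):
    pending.append((url, params))


def resize_handler(params):
    # A worker handler: performs the offline work for one task.
    return 'resized image %s' % params['image_id']


handlers = {'/resize': resize_handler}


def dispatch():
    # Stand-in for the task queue service delivering tasks one by one.
    results = []
    while pending:
        url, params = pending.pop(0)
        results.append(handlers[url](params))
    return results


enqueue('/resize', {'image_id': '42'})
enqueue('/resize', {'image_id': '43'})
print(dispatch())
```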
[google-appengine] Re: Just released: Python SDK 1.2.3
Thank you very much for your great work! I've tried asynchronous urlfetch. As far as I can tell, it worked well on App Engine. I've got a feeling that async urlfetch doesn't work asynchronously on the SDK. The SDK is single-threaded, so that is understandable. But if it's true, it might be better to mention this difference in the urlfetch documentation:
http://code.google.com/appengine/docs/python/urlfetch/asynchronousrequests.html

I also noticed the second example in that document lacks a line invoking the make_fetch_call() method. I guess the 2nd example should be like the following (note the `rpc=rpc` default argument, which binds the current rpc to each callback; a bare `lambda:` would see only the last rpc from the loop):

    rpcs = []
    for url in urls:
        rpc = urlfetch.create_rpc()
        rpc.callback = lambda rpc=rpc: handle_result(rpc)
        urlfetch.make_fetch_call(rpc, url)
        rpcs.append(rpc)

Regards,
-- Takashi Matsuo

On Fri, Jun 19, 2009 at 8:52 AM, Jason (Google) wrote:
> [original announcement snipped]
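One detail worth flagging in callback loops like the one above: Python closures bind names late, so a plain `lambda: handle_result(rpc)` written inside the loop would make every callback see the last rpc created. Capturing the current value with a default argument is the usual fix. A minimal stand-alone demonstration (no App Engine APIs involved):

```python
# Demonstration of Python's late-binding closures, the pitfall in
# callback loops of the form `rpc.callback = lambda: handle_result(rpc)`.

late = []
for i in range(3):
    late.append(lambda: i)          # every lambda reads i at call time
print([f() for f in late])          # -> [2, 2, 2]: all see the final i

bound = []
for i in range(3):
    bound.append(lambda i=i: i)     # default argument captures the current i
print([f() for f in bound])         # -> [0, 1, 2]
```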
[google-appengine] Re: Just released: Python SDK 1.2.3
Thank you! These are really great new features.
[google-appengine] Re: Just released: Python SDK 1.2.3
Cool, great job. Very good release! Would it be possible to have an example of "bucket_size" usage? How does it work? Currently it is not clear to me.

Regards

On 19 juin, 08:11, cz wrote:
> Thank you!
> These are really great new features.
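For what it's worth, my understanding is that `rate` sets a queue's sustained throughput while `bucket_size` caps how many tasks may fire in a burst: a token bucket refills at `rate`, each executed task spends one token, and at most `bucket_size` tokens can accumulate. A queue.yaml along these lines (the queue name here is just an example):

```yaml
queue:
- name: fast-queue
  rate: 5/s        # refill five tokens per second (sustained rate)
  bucket_size: 10  # at most ten tasks can run as an immediate burst
```

So with this configuration, a backlog can be worked off in bursts of up to ten tasks at once, but averages out to five tasks per second over time.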
[google-appengine] Re: Just released: Python SDK 1.2.3
Excellent!

Are there any limits on the 'params' structure in the task queue?

Can we (should we!?!) pass around really big data via this, or would it be best stored in memcache (for example) with just the key passed?
[google-appengine] Re: Just released: Python SDK 1.2.3
Barry,

I believe you treat each task as a web request, and at the moment there is a 10K limit (http://code.google.com/appengine/docs/python/taskqueue/overview.html) on the size of task items. I believe the best course of action is to stash the data in memcache (although I am sure you may get instances where it is evicted from memcache). From what I understand, enqueuing onto the task queue is a lot faster than storing a temporary object in the datastore; depending on your reason for using the queue, persisting the object to the datastore might negate some of its usefulness.

I think some experimentation is needed.

Paul

2009/6/19 Barry Hunter
> Excellent!
>
> Is there any limits on the 'params' structure in the task queue?
>
> Can we (should we!?!) pass around really big data via this, or would
> it be best stored in memcache (for example) and just the key passed?
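The stash-in-memcache pattern Paul describes can be sketched as follows. This is a self-contained illustration with dict/list stand-ins, since on App Engine you would use google.appengine.api.memcache and google.appengine.api.labs.taskqueue instead; the helper names and the '/worker' URL are hypothetical.

```python
import uuid

# Stand-ins for App Engine services, to show the pattern: put the large
# payload in memcache, enqueue only the key (well under the 10K limit).
cache = {}    # stand-in for memcache
queue = []    # stand-in for the task queue


def enqueue_large_payload(payload):
    key = 'task-' + uuid.uuid4().hex
    cache[key] = payload                # memcache.set(key, payload)
    queue.append({'key': key})          # taskqueue.add(url='/worker',
                                        #               params={'key': key})
    return key


def worker(params):
    # The worker fetches the payload by key. It must tolerate eviction,
    # since memcache entries can disappear at any time.
    payload = cache.get(params['key'])
    if payload is None:
        return None  # payload evicted: skip, retry, or rebuild from source
    return payload


key = enqueue_large_payload('x' * 100000)   # far larger than 10K
assert worker(queue[0]) == 'x' * 100000
```

The eviction branch is the catch Paul alludes to: unlike the datastore, memcache offers no durability guarantee, so the worker needs a fallback.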
[google-appengine] Re: Just released: Python SDK 1.2.3
Regarding Django support: is it 1.0.2 support or just 1.0 support?

I'm currently using zip import (which slows things down significantly when the app instance is cold). The release notes say that Django needs to be installed, but where? Is 0.96 removed?

On Jun 19, 11:51 am, Paul Kinlan wrote:
> [10K limit / memcache discussion snipped]
[google-appengine] Re: Just released: Python SDK 1.2.3
Django 1.0.2. Check http://code.google.com/intl/fr/appengine/docs/python/tools/libraries.html#Django for the version details.

    from google.appengine.dist import use_library
    use_library('django', '1.0')

On 19 juin, 13:05, Ubaldo Huerta wrote:
> [Django version question snipped]
[google-appengine] Re: Just released: Python SDK 1.2.3
Is TaskQueue the Google solution for long-running tasks, or will there be another new API? What about Big File Storage? I heard nothing about it at Google I/O.

Great work guys!

On Jun 19, 9:08 am, Sylvain wrote:
> [use_library snippet snipped]
[google-appengine] Re: Just released: Python SDK 1.2.3
    from google.appengine.dist import use_library
    use_library('django', '1.0')

    import logging, os

    # Google App Engine imports.
    from google.appengine.ext.webapp import util

    # Force Django to reload its settings.
    from django.conf import settings
    settings._target = None

    # Must set this env var *before* importing any part of Django
    os.environ['DJANGO_SETTINGS_MODULE'] = 'settings'

    import django.core.handlers.wsgi
    import django.core.signals
    import django.db
    import django.dispatch.dispatcher

    def log_exception(*args, **kwds):
        logging.exception('Exception in request:')

    # Log errors.
    django.dispatch.dispatcher.connect(
        log_exception, django.core.signals.got_request_exception)

    # Unregister the rollback event handler.
    django.dispatch.dispatcher.disconnect(
        django.db._rollback_on_exception,
        django.core.signals.got_request_exception)

    def main():
        # Re-add Django 1.0 archive to the path, if needed.
        # if django_path not in sys.path:
        #     sys.path.insert(0, django_path)

        # Create a Django application for WSGI.
        application = django.core.handlers.wsgi.WSGIHandler()

        # Run the WSGI CGI handler with that application.
        util.run_wsgi_app(application)

    if __name__ == '__main__':
        main()

On Jun 19, 7:05 pm, Ubaldo Huerta wrote:
> [Django version question snipped]
[google-appengine] Re: Just released: Python SDK 1.2.3
Some initial thoughts on using the task queue API:

1. It is very easy to create a chain reaction if you don't know what you are doing :P

2. Using the queues with dev_appserver.py is very nice, so you can test things out and see how things get queued.

3. I would like to see a flush-queue option (or something) on the production server, as well as a way to look at the queue.

4. My (horrible) first try at queues with production data spawned a lot of tasks, most of which I now wish I could just remove so I could start over.

5. It seemed like I generated 10x the tasks I was expecting. Not sure if that is my mistake, but it didn't seem to have this order of magnitude when I tried with development data, so I am not sure if that is my fault or what.

6. Currently my queue is stuck and not progressing; again, not sure if that is my fault or not.

Thanks again, the API itself is drop-dead simple and fun.

-t

On Jun 20, 12:38 am, CaiSong wrote:
> [main.py example snipped]
[google-appengine] Re: Just released: Python SDK 1.2.3
Congratulations! It also works great with web2py. web2py has a built-in cron-like mechanism, but it does not work on GAE. Now, thanks to the Task Queue API, we may be able to port our cron API (or some part of it) to GAE.

Massimo

On Jun 18, 6:52 pm, "Jason (Google)" wrote:
> [original announcement snipped]
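One common way to get cron-like behavior out of a task queue, which may be what a web2py port would look like, is a task that performs one unit of work and then re-enqueues itself with a delay. A self-contained sketch with a list standing in for the queue; on App Engine the re-enqueue would be something like taskqueue.add(url=..., countdown=seconds), and the names here are illustrative only.

```python
# Sketch of a self-rescheduling "cron" task built on a task queue.
queue = []   # stand-in for the task queue


def enqueue(handler, countdown=0):
    # Stand-in for taskqueue.add(url=..., countdown=countdown).
    queue.append((countdown, handler))


runs = []


def cron_task():
    runs.append('tick')          # the periodic unit of work
    if len(runs) < 3:            # a real cron task would reschedule forever
        enqueue(cron_task, countdown=60)   # run again in ~60 seconds


# Simulate the queue service draining tasks (ignoring the delay here):
enqueue(cron_task)
while queue:
    _, handler = queue.pop(0)
    handler()

assert runs == ['tick', 'tick', 'tick']
```

The appeal of this over real cron is that each run is an ordinary request with the usual retry semantics; the risk, as noted elsewhere in this thread, is an accidental chain reaction if the rescheduling condition is wrong.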
[google-appengine] Re: Just released: Python SDK 1.2.3
On Sat, Jun 20, 2009 at 5:00 PM, Thomas wrote:
>
> Some initial thoughts using the task queue api:
>
> 1. It is very easy to create a chain reaction if you don't know what
> you are doing :P

Indeed it is :-P

> 2. Using the queues with the dev_appserver.py is very nice such that
> you can test things out and see how things get queued.
>
> 3. Would like to see flush queue option (or something) in the
> production server, as well as to look at the queue.

We don't have this right now, but will certainly add it eventually. In the meantime I would encourage you to file this on the issue tracker.

> 4. My (horrible) first try at queues with production data spawned a
> lot of tasks, most of which now I wish I could just remove and start
> over.

One thing you can do is pause the queue. Another is to push a new version of your app that has a very simple handler for the URL the tasks are using; that way it will quickly eat through all of the dangling tasks.

> 5. It seemed like I generated 10x the tasks I was expecting, not
> sure if that is my mistake, but it didn't seem to have this order of
> magnitude when I tried with development data, so I am not sure if that
> is my fault or what.
>
> 6. Currently my queue is stuck and not progressing; again, not sure if
> that is my fault or not.
>
> Thanks again, the API itself is drop dead simple and fun.

Glad to hear!
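The "simple handler" drain trick mentioned above amounts to deploying, at the tasks' URL, a handler that does no work and returns 200 OK, so the queue marks each dangling task successful and discards it. Shown here as a plain WSGI app so it is self-contained; on App Engine you would wire an equivalent RequestHandler into webapp and run_wsgi_app, and the '/worker' path is a hypothetical task URL.

```python
from wsgiref.util import setup_testing_defaults


def drain_app(environ, start_response):
    # Any 2xx response tells the task queue the task succeeded, so
    # returning 200 OK without doing the work "eats" the task.
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return ['']  # no body needed; the work is intentionally skipped


# Quick local check using stdlib WSGI plumbing:
environ = {}
setup_testing_defaults(environ)
environ['PATH_INFO'] = '/worker'
environ['REQUEST_METHOD'] = 'POST'

status_holder = []


def start_response(status, headers):
    status_holder.append(status)


body = drain_app(environ, start_response)
print(status_holder[0])
```

Once the backlog is drained, you would push the real worker handler back as a new app version.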