[google-appengine] List Property containing keys - performance question

2009-06-20 Thread Morten Bek Ditlevsen
Hi there,
I have an entity with a list property containing keys:

  favorites = db.ListProperty(db.Key, indexed=False)


I suddenly came to wonder:
If I check if a key is in the list like this:

if thekey in user.favorites:


will that by any chance try and fetch any entities in the user.favorites
list?

I don't think so, but I would like to make sure! :-)

Sincerely,
/morten




[google-appengine] Re: Just released: Python SDK 1.2.3

2009-06-20 Thread CaiSong

-
from google.appengine.dist import use_library
use_library('django', '1.0')
import logging, os
# Google App Engine imports.
from google.appengine.ext.webapp import util

# Force Django to reload its settings.
from django.conf import settings
settings._target = None

# Must set this env var *before* importing any part of Django
os.environ['DJANGO_SETTINGS_MODULE'] = 'settings'

import django.core.handlers.wsgi
import django.core.signals
import django.db
import django.dispatch.dispatcher

def log_exception(*args, **kwds):
    logging.exception('Exception in request:')

# Log errors.
django.dispatch.dispatcher.connect(
    log_exception, django.core.signals.got_request_exception)

# Unregister the rollback event handler.
django.dispatch.dispatcher.disconnect(
    django.db._rollback_on_exception,
    django.core.signals.got_request_exception)

def main():
    # Re-add Django 1.0 archive to the path, if needed.
    #if django_path not in sys.path:
    #    sys.path.insert(0, django_path)

    # Create a Django application for WSGI.
    application = django.core.handlers.wsgi.WSGIHandler()

    # Run the WSGI CGI handler with that application.
    util.run_wsgi_app(application)

if __name__ == '__main__':
    main()
-

On Jun 19, 7:05 pm, Ubaldo Huerta uba...@gmail.com wrote:
 Regarding django support.

 Is it 1.02 support or just 1.0 support?

 I'm currently using zip import (which slows things down significantly
 when app instance is cold). The release notes says that django needs
 to be installed. But where? Is 0.96 removed?

 On Jun 19, 11:51 am, Paul Kinlan paul.kin...@gmail.com wrote:

  Barry,
  I believe you treat each task as a web request, and at the moment there is a
  10K limit (http://code.google.com/appengine/docs/python/taskqueue/overview.html)
  on the size of task items.  I believe the best course of action is to stash them
  in memcache (although there may be cases where the item gets evicted from
  memcache). From what I understand, enqueuing onto the task queue is a lot
  faster than storing a temp object in the datastore; depending on your reason
  for using the queue, persisting the object to the datastore might negate some
  of its usefulness.

  I think some experimentation is needed.

  Paul

  2009/6/19 Barry Hunter barrybhun...@googlemail.com

   Excellent!

   Is there any limits on the 'params' structure in the task queue?

   Can we (should we!?!) pass around really big data via this, or would
   it be best stored in memcache (for example) and just the key passed?
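
Along the lines of what Paul describes above, here is a minimal sketch of
passing a task only a memcache key instead of a large payload (the handler
URL, key scheme, and expiry are made up for illustration; this uses the labs
Task Queue module shipped in SDK 1.2.3):

from google.appengine.api import memcache
from google.appengine.api.labs import taskqueue

def enqueue_big_payload(payload_id, payload):
    # Stash the large object in memcache and hand the task just the key.
    cache_key = 'task-payload-%s' % payload_id
    memcache.set(cache_key, payload, time=600)
    taskqueue.add(url='/work', params={'payload_key': cache_key})

# In the /work handler, fetch it back with memcache.get(); since memcache
# items can be evicted, the handler should cope with a None result.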




[google-appengine] List Property containing keys - performance question

2009-06-20 Thread Federico Builes

Morten Bek Ditlevsen writes:
  Hi there,
  I have an entity with a list property containing keys:
 
favorites = db.ListProperty(db.Key, indexed=False)
 
  I suddenly came to wonder:
  If I check if a key is in the list like this:
  
  if thekey in user.favorites:
  
  will that by any chance try and fetch any entities in the user.favorites
  list?
  
  I don't think so, but I would like to make sure! :-)

When you do 'foo in bar' it's actually calling Python methods, not datastore
ops, and since Python sees favorites as a plain list of keys it should not
fetch the entities.

If you were to index this and do it on the datastore side (WHERE favorites =
thekey) it might have to un-marshal the property and do a normal lookup, but I
don't think the slowdown is noticeable.
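
A minimal sketch of that point (hypothetical User model and key, just to
illustrate that the check stays in memory):

from google.appengine.ext import db

class User(db.Model):
    favorites = db.ListProperty(db.Key, indexed=False)

user = User.get_or_insert('demo-user')
thekey = db.Key.from_path('Item', 'example-item')  # hypothetical key

# Pure in-memory comparison of db.Key values; no entity is fetched.
if thekey in user.favorites:
    pass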

-- 
Federico




[google-appengine] Re: List Property containing keys - performance question

2009-06-20 Thread Morten Bek Ditlevsen
Hi Federico,

Thanks for your answer - I'm just having a bit of a hard time figuring out
which datastore requests happen automatically.

I wondered because I had an error in the datastore:

  File "/base/data/home/apps/grindrservr/26.334331202299577521/main.py", line 413, in query
    if result in meStatus.blocks:
  File "/base/python_lib/versions/1/google/appengine/api/datastore_types.py", line 472, in __cmp__
    for elem in other.__reference.path().element_list():

The 'blocks' property is just like the 'favorites' described in my previous
mail - and 'result' is a value iterated over the results of a 'keys only'
query.

So I guess what I don't understand is why the datastore is in play here. I
know that my results object is probably an iterator, but why is that necessary
when you just query for keys?
That's what caused me to think that the error might be related to the
'blocks' list of keys...

Sincerely,
/morten


On Sat, Jun 20, 2009 at 10:22 AM, Federico Builes federico.bui...@gmail.com
 wrote:


 Morten Bek Ditlevsen writes:
   Hi there,
   I have an entity with a list property containing keys:
  
 favorites = db.ListProperty(db.Key, indexed=False)
  
   I suddenly came to wonder:
   If I check if a key is in the list like this:
  
   if thekey in user.favorites:
  
   will that by any chance try and fetch any entities in the user.favorites
   list?
  
   I don't think so, but I would like to make sure! :-)

 When you do 'foo in bar' it's actually calling Python methods, not datastore
 ops, and since Python sees favorites as a plain list of keys it should not
 fetch the entities.

 If you were to index this and do it on the datastore side (WHERE favorites =
 thekey) it might have to un-marshal the property and do a normal lookup, but I
 don't think the slowdown is noticeable.

 --
 Federico

 





[google-appengine] Re: UnacceptableVersionError when trying to import Django 1.0

2009-06-20 Thread gnz

After updating to 1.2.3 and modifying the code to use Django 1.0 (I
was using zipimport to load 1.0 before), I occasionally get this
exception. Once I get it, the only way to make it go away is to
restart the dev server. Could it be that making changes while the dev
server is running triggers some sort of reload that ignores the
use_library call?

gonzalo

On Jun 20, 6:48 am, CaiSong xs23...@gmail.com wrote:
 1. App Engine SDK version 1.2.3
 2. Install Django 1.0.2 locally
 3. Add the following lines to the beginning of your script handler
 (main.py):
 from google.appengine.dist import use_library
 use_library('django', '1.0')
 4. In settings.py, comment out the following lines:
 #    'django.contrib.sessions.middleware.SessionMiddleware',
 #    'django.contrib.auth.middleware.AuthenticationMiddleware',

 #    'django.contrib.auth',
 #    'django.contrib.sessions',

 settings.py e.g.:
 ___
 # Django settings for src project.
 import os
 DEBUG = True
 TEMPLATE_DEBUG = DEBUG

 BASE_PATH = os.path.dirname(__file__)

 ADMINS = (
     # ('Your Name', 'your_em...@domain.com'),
 )

 MANAGERS = ADMINS

 DATABASE_ENGINE = ''    # 'postgresql_psycopg2', 'postgresql', 'mysql', 'sqlite3' or 'oracle'.
 DATABASE_NAME = ''      # Or path to database file if using sqlite3.
 DATABASE_USER = ''      # Not used with sqlite3.
 DATABASE_PASSWORD = ''  # Not used with sqlite3.
 DATABASE_HOST = ''      # Set to empty string for localhost. Not used with sqlite3.
 DATABASE_PORT = ''      # Set to empty string for default. Not used with sqlite3.

 # Local time zone for this installation. Choices can be found here:
 #http://en.wikipedia.org/wiki/List_of_tz_zones_by_name
 # although not all choices may be available on all operating systems.
 # If running in a Windows environment this must be set to the same as your
 # system time zone.
 TIME_ZONE = 'America/Chicago'

 # Language code for this installation. All choices can be found here:
 #http://www.i18nguy.com/unicode/language-identifiers.html
 LANGUAGE_CODE = 'en-us'

 SITE_ID = 1

 # If you set this to False, Django will make some optimizations so as not
 # to load the internationalization machinery.
 USE_I18N = True

 # Absolute path to the directory that holds media.
 # Example: /home/media/media.lawrence.com/
 MEDIA_ROOT = ''

 # URL that handles the media served from MEDIA_ROOT. Make sure to use a
 # trailing slash if there is a path component (optional in other cases).
 # Examples: "http://media.lawrence.com", "http://example.com/media/"
 MEDIA_URL = ''

 # URL prefix for admin media -- CSS, JavaScript and images. Make sure to use a
 # trailing slash.
 # Examples: "http://foo.com/media/", "/media/".
 ADMIN_MEDIA_PREFIX = '/media/'

 # Make this unique, and don't share it with anybody.
 SECRET_KEY = 'p7x1he4xp117$h3r#l!*ra9wc5d=o(r...@p5x^14^6)mmj_t28w)'

 # List of callables that know how to import templates from various sources.
 TEMPLATE_LOADERS = (
     'django.template.loaders.filesystem.load_template_source',
     'django.template.loaders.app_directories.load_template_source',
 #     'django.template.loaders.eggs.load_template_source',
 )

 MIDDLEWARE_CLASSES = (
     'django.middleware.common.CommonMiddleware',
 #    'django.contrib.sessions.middleware.SessionMiddleware',
 #    'django.contrib.auth.middleware.AuthenticationMiddleware',
 )

 ROOT_URLCONF = 'urls'

 TEMPLATE_DIRS = (
     # Put strings here, like "/home/html/django_templates" or "C:/www/django/templates".
     # Always use forward slashes, even on Windows.
     # Don't forget to use absolute paths, not relative paths.
     os.path.join(BASE_PATH,'templates'),
 )

 INSTALLED_APPS = (
 #    'django.contrib.auth',
     'django.contrib.contenttypes',
 #    'django.contrib.sessions',
     'django.contrib.sites',
 )

 ___
 main.py e.g:
 ___
 from google.appengine.dist import use_library
 use_library('django', '1.0')
 import logging, os
 # Google App Engine imports.
 from google.appengine.ext.webapp import util

 # Force Django to reload its settings.
 from django.conf import settings
 settings._target = None

 # Must set this env var *before* importing any part of Django
 os.environ['DJANGO_SETTINGS_MODULE'] = 'settings'

 import django.core.handlers.wsgi
 import django.core.signals
 import django.db
 import django.dispatch.dispatcher

 def log_exception(*args, **kwds):
     logging.exception('Exception in request:')

 # Log errors.
 django.dispatch.dispatcher.connect(
    log_exception, django.core.signals.got_request_exception)

 # Unregister the rollback event handler.
 django.dispatch.dispatcher.disconnect(
     django.db._rollback_on_exception,
     django.core.signals.got_request_exception)

 def main():
     # Re-add Django 1.0 archive to the path, if needed.
 #  

[google-appengine] Re: 30 second request limit - a killer?

2009-06-20 Thread Charlie

Your link to the Compass forum is broken for me. Not that I know
anything much about Compass, anyway.

You will have to figure out a way to rebuild the indexes
incrementally.

I would suggest looking at the new Task Queue api. It seems
appropriate for something like this -- rebuild a bit of the indexes,
then post a new task to rebuild some more, etc.
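
As a rough illustration of that idea (Python only for now; the handler URL,
batch size, and rebuild_batch stub below are hypothetical placeholders, not
Compass code):

from google.appengine.api.labs import taskqueue
from google.appengine.ext import webapp

BATCH_SIZE = 50

def rebuild_batch(offset, batch_size):
    # Placeholder: reindex the next batch_size entities starting at offset,
    # returning True while there is more work left to do.
    return False

class RebuildIndexHandler(webapp.RequestHandler):
    def post(self):
        offset = int(self.request.get('offset', 0))
        if rebuild_batch(offset, BATCH_SIZE):
            # Chain the next slice as a new task, keeping each request
            # comfortably under the 30 second limit.
            taskqueue.add(url='/tasks/rebuild',
                          params={'offset': offset + BATCH_SIZE})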

On Jun 19, 10:57 pm, Dominik Steiner
dominik.j.stei...@googlemail.com wrote:
 Hi there,

 I have made my first steps with GAE on Java and it had been a pleasure
 to develop with the eclipse plugin for GWT and GAE. As the JDO query
 implementation of GAE is quite reduced, I used the Compass framework
 to work around that and it looked like it could get my app going.

 But as you can read in the following forum post

 http://forum.compass-project.org/thread.jspa?messageID=298249

 I have run into problems that my data in the GAE database and the
 Compass cache is running out of sync. The solution from Compass side
 to trigger an indexing of the Compass cache is failing because that
 operation is taking more than 30 seconds and thus is throwing an
 error.

 So my questions are: have others run into the same problem and could
 fix it? what would be a workaround of the 30 second limit?

 I really would love to see my app running on GAE, but right now that
 problem is killing it.

 Anybody with some hints or ideas?

 Thanks

 Dominik



[google-appengine] Tuples, Exploding Index, IO talk

2009-06-20 Thread hawkett

Hi,

   I was watching Brett's I/O talk on using 'Relational Index Tables',
and there were a few hints of things in there, and I just wanted to
check that I got it all correctly -

1.  Lists are good for tuples - a use case I see is an entity being
tagged, and having a state within that tag - so the tuples might be
('tagA', 'PENDING') , ('tagB', 'ACCEPTED'), ('tagC', 'DENIED') etc. -
so the list structures would be

class Thing(db.Model):
  name = db.StringProperty()
  tags = db.ListProperty(str, default=[])
  states = db.ListProperty(str, default=[])

with their contents tags = ['tagA', 'tagB', 'tagC'], states =
['PENDING', 'ACCEPTED', 'DENIED']

and as data comes and goes you maintain both lists to ensure you
record the correct state for the correct tag by matching their list
position.
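
A minimal sketch of keeping the two lists in step (helper names are made up
for illustration):

def add_tag(thing, tag, state):
    # Append to both lists together so index positions stay paired.
    thing.tags.append(tag)
    thing.states.append(state)

def remove_tag(thing, tag):
    # Remove the tag and its paired state at the same position.
    i = thing.tags.index(tag)
    del thing.tags[i]
    del thing.states[i]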

2.  Relational Index Tables are good for exploding index problems - so
the query here might be -
get me all the 'Things' which have 'tagA' and which are 'PENDING' in
that tag - i.e. all records with the tuple ('tagA, 'PENDING'), which
would be a composite index over two list properties - an exploding
index.

So assuming I've got the above right, I'm trying to work out a few
things

a.  Without relational index tables, what is the best way to construct
the query - e.g.

things = db.GqlQuery(
  "SELECT * FROM Thing "
  "WHERE tags = :1 AND states = :2", 'tagA', 'PENDING')

which would get me anything that had 'tagA' at any point in the tags
list, and anything that had a 'PENDING' at any point in the states
list. This is potentially many more records than those that match the
tuple.  So then I have to do an in-memory cull of those records
returned and work out which ones actually conform to the tuple?  Just
wondering if I am missing something here, because it seems like a
great method for storing a tuple, but complex to query for that same
tuple?
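
A minimal sketch of that in-memory cull, pairing the two lists back up into
tuples (using the query result from above):

wanted = ('tagA', 'PENDING')
matching = [t for t in things
            if wanted in zip(t.tags, t.states)]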

b.  If I am going to use relational index tables, to avoid the
exploding index that the above query could generate -

class Thing(db.Model):
  name = db.StringProperty()

class ThingTagIndex(db.Model):
  tags = db.ListProperty(str, default=[])

class ThingStateIndex(db.Model):
  states = db.ListProperty(str, default=[])

then am I right in thinking that my query would be performed as

tagIndexKeys = db.GqlQuery(
  "SELECT __key__ FROM ThingTagIndex "
  "WHERE tags = :1", 'tagA')

# All the things that have 'tagA' in their tags list
thingTagKeys = [k.parent() for k in tagIndexKeys]

stateIndexKeys = db.GqlQuery(
  "SELECT __key__ FROM ThingStateIndex "
  "WHERE states = :1 AND ANCESTOR IN :2", 'PENDING', thingTagKeys)

# All the things that have both 'tagA' and 'PENDING' (but not
necessarily as a tuple)
thingKeys = [k.parent() for k in stateIndexKeys]

things = db.get(thingKeys)

# Oops - I need the lists to do the culling part of my tuple query
from (a)

So I have avoided the exploding index by performing two separate
queries, but I could have achieved much the same result without the
index tables - i.e. by performing separate queries and avoiding the
composite index.  Just wondering if I am seeing the tuple situation
correctly - i.e. there is no way to query them that doesn't require
some in-memory culling?  Thanks,

Colin




[google-appengine] Re: No way to delete error data entry

2009-06-20 Thread Charlie Zhu

Thank you, Nick,

I have written code as below with low level API to delete the entry.
It runs without error but seems not totally working. And thanks god
that data suddenly appeared at Data Viewer and problem resolved.

Code pasted here and hope it useful for others

import java.io.IOException;

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;
import com.google.appengine.api.datastore.Query;

// doGet() of the servlet that performs the deletion.
public void doGet(HttpServletRequest req, HttpServletResponse resp)
        throws IOException {
    String tbname = req.getParameter("tbname");
    if (tbname != null) {
        DatastoreService datastore =
                DatastoreServiceFactory.getDatastoreService();

        // Query for all entities of the given kind and delete them one by one.
        Query query = new Query(tbname);
        for (Entity taskEntity : datastore.prepare(query).asIterable()) {
            datastore.delete(taskEntity.getKey());
        }
    }
}


Regards,
Charlie

On Jun 17, 11:58 pm, Nick Johnson (Google) nick.john...@google.com
wrote:
 Hi Charlie,

 Your easiest option here is probably to upload an alternate major version of
 your app with the old schema, and use that to retrieve and fix the faulty
 entit(y|ies). Alternate approaches include using the low level datastore
 API, or uploading a Python version that uses the low level API or
 db.Expando.
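
A rough sketch of the db.Expando route Nick mentions, in Python (the kind name
is taken from the error message; the batch size and the choice to delete rather
than repair are just for illustration):

from google.appengine.ext import db

class CDKFingerprint(db.Expando):
    pass

def fix_or_delete_broken(batch_size=100):
    # Expando does not enforce the new schema, so entities missing the
    # bits_count property can still be loaded, then repaired or deleted.
    for e in CDKFingerprint.all().fetch(batch_size):
        db.delete(e)   # or: e.bits_count = 0; e.put()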

 -Nick Johnson





 On Wed, Jun 17, 2009 at 9:15 AM, Charlie Zhu zh.char...@gmail.com wrote:

  Hi,

  I have tried all ways I known to delete some schema changing caused
  error Entities and failed.

  1. Delete on Data Viewer on the console.
  Data Viewer shows No Data Yet.

  2. Delete by code
  Below is part of the codes:
         Query q = pm.newQuery(CDKFingerprint.class);
         List<CDKFingerprint> results2;
         results2 = (List<CDKFingerprint>) q.execute();
         pm.deletePersistentAll(results2);
  But that cause server error:
  java.lang.NullPointerException: Datastore entity with kind
  CDKFingerprint and key CDKMol(c=cc=cc=c)/CDKFingerprint(1) has a null
  property named bits_count.  This property is mapped to
  cdkhelper.CDKFingerprint.bits_count, which cannot accept null values.
  ...
  at org.datanucleus.jdo.JDOPersistenceManager.deletePersistentAll
  (JDOPersistenceManager.java:795)
  ...

  3. Assign values to the NULL field then delete
  The code
         for(CDKFingerprint r: results2) {
                 r.bits_count = 0;
                 pm.makePersistent(r);
         }
  And server error again
  java.lang.NullPointerException: Datastore entity with kind
  CDKFingerprint and key CDKMol(c=cc=cc=c)/CDKFingerprint(1) has a null
  property named bits_count.  This property is mapped to
  cdkhelper.CDKFingerprint.bits_count, which cannot accept null values.
  ...
  at org.datanucleus.store.appengine.query.StreamingQueryResult
  $AbstractListIterator.hasNext(StreamingQueryResult.java:205)
  ...

  Having no idea and hoping help.

  Regards,
  Charlie

 --
 Nick Johnson, App Engine Developer Programs Engineer
 Google Ireland Ltd. :: Registered in Dublin, Ireland, Registration Number:
 368047



[google-appengine] Re: 30 second request limit - a killer?

2009-06-20 Thread Dominik Steiner

Thanks Charlie for the fast reply,

here is the link to the Compass forum post

http://forum.compass-project.org/forum.jspa?forumID=37

I will have a look at the Task Queue API and write back to that forum
if it helps.

Dominik

P.S. Just to clarify: by Task Queue, do you mean the java.util.TaskQueue
class?

On 20 Jun., 05:08, Charlie charlieev...@mac.com wrote:
 Your link to the Compass forum is broken for  me. Not that I know
 anything really about Compass, anyways.

 You will have to figure out a way to rebuild the indexes
 incrementally.

 I would suggest looking at the new Task Queue api. It seems
 appropriate for something like this -- rebuild a bit of the indexes,
 then post a new task to rebuild some more, etc.

 On Jun 19, 10:57 pm, Dominik Steiner

 dominik.j.stei...@googlemail.com wrote:
  Hi there,

  I have made my first steps with GAE on Java and it had been a pleasure
  to develop with the eclipse plugin for GWT and GAE. As the JDO query
  implementation of GAE is quite reduced, I used the Compass framework
  to work around that and it looked like it could get my app going.

  But as you can read in the following forum post

 http://forum.compass-project.org/thread.jspa?messageID=298249

  I have run into problems that my data in the GAE database and the
  Compass cache is running out of sync. The solution from Compass side
  to trigger an indexing of the Compass cache is failing because that
  operation is taking more than 30 seconds and thus is throwing an
  error.

  So my questions are: have others run into the same problem and could
  fix it? what would be a workaround of the 30 second limit?

  I really would love to see my app running on GAE, but right now that
  problem is killing it.

  Anybody with some hints or ideas?

  Thanks

  Dominik



[google-appengine] SMS verification trouble.

2009-06-20 Thread Patipat Susumpow
Hi,

I can't verify my account by SMS from
http://appengine.google.com/permissions/smssend.do. I have tried many times
with friends' mobile phone numbers on various supported operators in Thailand,
but I always get the message "The phone number has been sent too many messages
or has already been used to confirm an account."

I'm puzzled because I have never used this verification method before, yet I
still get this error.

Thanks,
Patipat.




[google-appengine] Vacuum Indexes - Datastore Indices Count

2009-06-20 Thread Phil

Hi,

I've come across the issue regarding vacuuming of indexes not correctly
releasing resources, producing the following exception: "Your
application is exceeding a quota: Datastore Indices Count".

Can someone please reset the quota on my application, 5starlivesbeta?

Also, I found it was much quicker to remove all indexes before clearing
down my test data (which I did with vacuum_indexes) and then recreate the
indexes once the datastore was empty - is this the recommended
approach?

Cheers!




[google-appengine] Re: Just released: Python SDK 1.2.3

2009-06-20 Thread Thomas

Some initial thoughts on using the Task Queue API:

1. It is very easy to create a chain reaction if you don't know what
you are doing :P

2. Using the queues with dev_appserver.py is very nice, in that
you can test things out and see how things get queued.

3. I would like to see a flush-queue option (or something similar) on the
production server, as well as a way to look at the queue.

4. My (horrible) first try at queues with production data spawned a
lot of tasks, most of which I now wish I could just remove so I can start
over.

5. It seemed like I generated 10x the tasks I was expecting; not
sure if that is my mistake, but it didn't have this order of
magnitude when I tried with development data, so I am not sure if that
is my fault or what.

6. Currently my queue is stuck and not progressing; again, not sure if
that is my fault or not.

Thanks again - the API itself is drop-dead simple and fun.

-t

On Jun 20, 12:38 am, CaiSong xs23...@gmail.com wrote:
 -
 from google.appengine.dist import use_library
 use_library('django', '1.0')
 import logging, os
 # Google App Engine imports.
 from google.appengine.ext.webapp import util

 # Force Django to reload its settings.
 from django.conf import settings
 settings._target = None

 # Must set this env var *before* importing any part of Django
 os.environ['DJANGO_SETTINGS_MODULE'] = 'settings'

 import django.core.handlers.wsgi
 import django.core.signals
 import django.db
 import django.dispatch.dispatcher

 def log_exception(*args, **kwds):
     logging.exception('Exception in request:')

 # Log errors.
 django.dispatch.dispatcher.connect(
    log_exception, django.core.signals.got_request_exception)

 # Unregister the rollback event handler.
 django.dispatch.dispatcher.disconnect(
     django.db._rollback_on_exception,
     django.core.signals.got_request_exception)

 def main():
     # Re-add Django 1.0 archive to the path, if needed.
 #    if django_path not in sys.path:
 #        sys.path.insert(0, django_path)

     # Create a Django application for WSGI.
     application = django.core.handlers.wsgi.WSGIHandler()

     # Run the WSGI CGI handler with that application.
     util.run_wsgi_app(application)

 if __name__ == '__main__':
     main()
 -

 On Jun 19, 7:05 pm, Ubaldo Huerta uba...@gmail.com wrote:

  Regarding django support.

  Is it 1.02 support or just 1.0 support?

  I'm currently using zip import (which slows things down significantly
  when app instance is cold). The release notes says that django needs
  to be installed. But where? Is 0.96 removed?

  On Jun 19, 11:51 am, Paul Kinlan paul.kin...@gmail.com wrote:

   Barry,
   I believe your treat each task as a webrequest and at the moment there is 
   a
   10K limit 
   (http://code.google.com/appengine/docs/python/taskqueue/overview.html) on 
   the
   size of task items.  I believe the best course of action is to stash them 
   in
   memcache (although I am sure you may get instances where it might be 
   removed
   from memcache) - from what I understand enqueing on
   to the task queue is a lot faster than storing a temp object in the
   data store, depending
   on the reason for you using the
   queue, persisting the object to the datastore might negate some of its
   usefulness.

   I think some experimentation is needed.

   Paul

   2009/6/19 Barry Hunter barrybhun...@googlemail.com

Excellent!

Is there any limits on the 'params' structure in the task queue?

Can we (should we!?!) pass around really big data via this, or would
it be best stored in memcache (for example) and just the key passed?




[google-appengine] Re: Vacuum Indexes - Datastore Indices Count

2009-06-20 Thread Jeff S (Google)

Hi Phil,

Apologies for the inconvenience. I've reset the index count for your
app. The speedup you saw from creating indexes on an empty datastore
is expected.

Happy coding,

Jeff

On Jun 20, 6:16 am, Phil phil.pet...@taters.co.uk wrote:
 Hi,

 I've come across the issue regardng vacuuming of indexes not correctly
 releasing resources creating the following exception: Your
 application is exceeding a quota: Datastore Indices Count

 Can someone please reset the quota on my application 5starlivesbeta.

 Also, I found it was much quicker removing all indexes before clearing
 down my test data (which I did with vacuum indexes) and then recreated
 indexes once the datastore was empty - is this the recommended
 approach?

 Cheers!



[google-appengine] List Property containing keys - performance question

2009-06-20 Thread Federico Builes

Morten Bek Ditlevsen writes:
  Thanks for your answers - I'm just having a bit of a hard time figuring out
  which data store requests happen automatically.
  
  I wondered because I had an error in the datastore:
  
File /base/data/home/apps/grindrservr/26.334331202299577521/main.py,
  line 413, in query
  if result in meStatus.blocks:
File /base/python_lib/versions/1/google/appengine/api/datastore_types.py,
  line 472, in __cmp__
  for elem in other.__reference.path().element_list():
  
  The 'blocks' property is just like the 'favorites' described in my previous
  mail - and 'result' is a value iterated over the results from a 'keys only'
  query.
  
  So I guess what I don't understand is why the datastore is in play here. I
  know that my results is probably an iterator, but why is this necessary when
  you just query for keys?
  That's what caused be to think that the error might be related to the
  'blocks' list of keys...

First of all, take my answer with a grain of salt since I'm new here too :)

I think datastore_types.py is not open source (or at least I can't find it in
http://code.google.com/p/googleappengine), but since path().element_list()
returns the parents/children of an entity, that line should just be doing that
for 'other'. Maybe it's just unmarshalling the results or something; I don't
think it's fetching a whole entity. Why don't you paste the full error
trace so we can help you better?

-- 
Federico Builes




[google-appengine] Model design for wiki like index pages

2009-06-20 Thread Jesse Grosjean

I have a wiki like app.

The basic model is a page, which has a title (StringProperty) and a body
(TextProperty) property. It all seems to work well, but I'm not sure
how well my model will scale. The problem I see is that I want to have
an Index page, which lists all other pages.

My concern is that when a model object is loaded in GAE, all property fields
are loaded in from the store at the same time. That would seem to
pose a problem for my app on index pages, because it would mean that when
someone visits the index page, both the title (which I want) and the body
(which I don't need) of every page would need to be loaded from the
store. Loading the body in this case seems wasteful, and possibly very
problematic performance-wise on a site with many pages.

My questions:

1. Is this a problem that other people are worrying about? Should I
worry about it? I could solve the problem by dividing my page model
into two separate models: one that contained the title and a
reference to another model which would contain the page body (see the
sketch after question 2). That should make the index page scale, but it
complicates the rest of the app. I'd prefer to avoid that route if possible.

2. Is there a way, now or in the future, to specify that certain
fields in a model are lazily loaded, i.e. not fetched and returned in the
initial query?
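
A minimal sketch of the split described in question 1 (model names are made
up for illustration):

from google.appengine.ext import db

class Page(db.Model):
    # Small, listable data only: cheap to load for the Index page.
    title = db.StringProperty()

class PageBody(db.Model):
    # The heavyweight text lives in its own entity, fetched on demand.
    page = db.ReferenceProperty(Page, collection_name='bodies')
    body = db.TextProperty()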

Thanks,
Jesse



[google-appengine] Re: 30 second request limit - a killer?

2009-06-20 Thread gadgster

I think Charlie is referring to the newly released Task Queue API:

http://googleappengine.blogspot.com/2009/06/new-task-queue-api-on-google-app-engine.html

Unfortunately for you, this is Python only at the moment. I would
think you will see it in the Java version soon.


On Jun 20, 3:56 pm, Dominik Steiner dominik.j.stei...@googlemail.com
wrote:
 Thanks Charlie for the fast reply,

 here is the link to the Compass forum post

 http://forum.compass-project.org/forum.jspa?forumID=37

 I will have a look in the Task Queue Api and write back to that forum
 if it helped.

 Dominik

 P.S. just to clarify, with Task Queue you mean the java.util.TaskQueue
 class?

 On 20 Jun., 05:08, Charlie charlieev...@mac.com wrote:



  Your link to the Compass forum is broken for  me. Not that I know
  anything really about Compass, anyways.

  You will have to figure out a way to rebuild the indexes
  incrementally.

  I would suggest looking at the new Task Queue api. It seems
  appropriate for something like this -- rebuild a bit of the indexes,
  then post a new task to rebuild some more, etc.

  On Jun 19, 10:57 pm, Dominik Steiner

  dominik.j.stei...@googlemail.com wrote:
   Hi there,

   I have made my first steps with GAE on Java and it had been a pleasure
   to develop with the eclipse plugin for GWT and GAE. As the JDO query
   implementation of GAE is quite reduced, I used the Compass framework
   to work around that and it looked like it could get my app going.

   But as you can read in the following forum post

  http://forum.compass-project.org/thread.jspa?messageID=298249

   I have run into problems that my data in the GAE database and the
   Compass cache is running out of sync. The solution from Compass side
   to trigger an indexing of the Compass cache is failing because that
   operation is taking more than 30 seconds and thus is throwing an
   error.

   So my questions are: have others run into the same problem and could
   fix it? what would be a workaround of the 30 second limit?

   I really would love to see my app running on GAE, but right now that
   problem is killing it.

   Anybody with some hints or ideas?

   Thanks

   Dominik



[google-appengine] Re: Model design for wiki like index pages

2009-06-20 Thread Takashi Matsuo

Hi Jesse,

One thing that comes to mind first is to use key_name for this purpose.
Perhaps you can use page names as the key_name when storing pages; then you
can use a keys_only query for rendering the Index page.
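
A minimal sketch of that idea (hypothetical Page model; the page name doubles
as the key_name):

from google.appengine.ext import db

class Page(db.Model):
    body = db.TextProperty()

# Store: the page name becomes the key_name.
Page(key_name='HomePage', body='welcome...').put()

# Index page: a keys-only query returns db.Key objects, so no bodies are
# loaded; each title is recovered from the key's name.
titles = [k.name() for k in db.GqlQuery("SELECT __key__ FROM Page")]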

Just my 2 yen.

-- Takashi Matsuo



On Sun, Jun 21, 2009 at 6:49 AM, Jesse Grosjean je...@hogbaysoftware.com wrote:

 I have a wiki like app.

 The basic model is a page which has title(StringProperty) and a body
 (TextProperty) properties. It all seems to work well, but I'm not sure
 how well my model will scale. The problem I see is that I want to have
 an Index page, which lists all other pages.

 My concern is when a model object is loaded in GAE all property fields
 are also loaded in from the store at the same time. That would seem to
 post a problem with my app on index pages, because it would mean when
 someone visits the index page both the title (which I want) and body
 (which I don't need) for all pages would need to be loaded from the
 store. Loading the body in this case seems wasteful, and possibly very
 problematic performance wise on a site with many pages.

 My questions:

 1. Is this a problem that other people are worrying about, should I
 worry about it? I could solve the problem by dividing my page model
 into two separate models... on that contained the title and a
 reference to another model which would contain page body. That should
 make the index page scale, but it complicates the rest of the app. I'd
 prefer to avoid that rout if possible.

 2. Is there, or is there a future possibility to specify that certain
 fields in a model are lazy load, not fetched and returned in the
 initial query?

 Thanks,
 Jesse
 





[google-appengine] Re: Model design for wiki like index pages

2009-06-20 Thread Tim Hoffman

Hi

Use memcache.  Have the index page retrieve everything it needs, then
stick only the summary data in memcache, and be aggressive about serving
that data out of memcache.

When updating/adding individual pages, flush their bit of the cache.
Key the memcache entries on the actual entity key or a path (whatever
is appropriate).

Then, as mentioned elsewhere, establish a mechanism for doing keys-only
queries:
http://groups.google.com.au/group/google-appengine/browse_thread/thread/0e77738adc609864#

Then retrieve the summary info from memcache, only fetching a full entity
when it is missing from the cache (and pushing its details back into the
cache).

You could even run a cron task to keep the cache loaded.
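
A minimal sketch of that pattern (the model and cache-key scheme are made up
for illustration):

from google.appengine.api import memcache
from google.appengine.ext import db

class Page(db.Model):
    title = db.StringProperty()
    body = db.TextProperty()

def title_for(key):
    # Try the cache first; fall back to a full fetch and repopulate.
    cache_key = 'summary:%s' % key
    title = memcache.get(cache_key)
    if title is None:
        title = Page.get(key).title
        memcache.set(cache_key, title)
    return title

def index_titles():
    # Keys-only query, then summaries served mostly out of memcache.
    return [title_for(k) for k in db.GqlQuery("SELECT __key__ FROM Page")]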

Rgds

T



On Jun 21, 5:49 am, Jesse Grosjean je...@hogbaysoftware.com wrote:
 I have a wiki like app.

 The basic model is a page which has title(StringProperty) and a body
 (TextProperty) properties. It all seems to work well, but I'm not sure
 how well my model will scale. The problem I see is that I want to have
 an Index page, which lists all other pages.

 My concern is when a model object is loaded in GAE all property fields
 are also loaded in from the store at the same time. That would seem to
 post a problem with my app on index pages, because it would mean when
 someone visits the index page both the title (which I want) and body
 (which I don't need) for all pages would need to be loaded from the
 store. Loading the body in this case seems wasteful, and possibly very
 problematic performance wise on a site with many pages.

 My questions:

 1. Is this a problem that other people are worrying about, should I
 worry about it? I could solve the problem by dividing my page model
 into two separate models... on that contained the title and a
 reference to another model which would contain page body. That should
 make the index page scale, but it complicates the rest of the app. I'd
 prefer to avoid that rout if possible.

 2. Is there, or is there a future possibility to specify that certain
 fields in a model are lazy load, not fetched and returned in the
 initial query?

 Thanks,
 Jesse



[google-appengine] Cheetah

2009-06-20 Thread mobil

Would it be possible to support Cheetah on Google App Engine?

The non-C version of Cheetah is very slow on App Engine.