[google-appengine] Re: how to write the output of a django template into a frame?
template.render returns you the contents of the template, and you are trying to provide that to the src attribute. The src attribute should be the LOCATION, not the content. See http://www.w3schools.com/TAGS/tag_iframe.asp Should be something like: self.response.out.write('<iframe src="/myframe" height="400"></iframe>') (and then your 'myframe' url returns the contents you currently have in x).

On Feb 25, 7:49 pm, sjh wrote: > Hi > > I am having a nightmare of a time trying to figure out how to get the > google chart api to work with GAE. I can use a template to generate my > chart but I need to put that output into a frame and I am stuck. > Here is the call: > > def post(self): > mytitle ='"Hello World"' > mydata='t:40,20,50,20,100|10,10,10,10,10' > template_values = {'mytitle': mytitle, > 'mydata': mydata} > > path = os.path.join(os.path.dirname(__file__), "index.html") > x=template.render(path, template_values) > self.response.out.write(' height="400">') > > X -- marks where I am stuck > > Here is the template (index.html) > > http://www.w3.org/1999/xhtml";> > > > > // Send the POST when the page is loaded, > // which will replace this whole page with the retrieved chart. > function loadGraph() { > var frm = document.getElementById('post_form'); > if (frm) { > frm.submit(); > } > } > > > > id='post_form'> > > > > > > > > > > Any suggestions would be much appreciated. > > thanks, Simon -- You received this message because you are subscribed to the Google Groups "Google App Engine" group. To post to this group, send email to google-appeng...@googlegroups.com. To unsubscribe from this group, send email to google-appengine+unsubscr...@googlegroups.com. For more options, visit this group at http://groups.google.com/group/google-appengine?hl=en.
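The suggested fix can be sketched in plain Python. These are stand-in functions, not real google.appengine webapp handlers, and the '/myframe' route name is taken from the reply above:

```python
# Sketch of the two-handler approach suggested above. Plain functions stand
# in for GAE request handlers so this runs anywhere; '/myframe' is the route
# name used in the reply, not an API requirement.

def main_page():
    # The main handler writes an iframe whose src is a LOCATION, not the
    # rendered template content.
    return '<iframe src="/myframe" height="400"></iframe>'

def myframe_page(rendered_template):
    # The /myframe handler returns what template.render() produced
    # (the 'x' variable in the original post).
    return rendered_template

html = main_page()
```

The browser loads the outer page, sees the iframe, and then issues a second request to /myframe for the chart content itself.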
[google-appengine] Re: Issue with app deployment?
Same here, at least I've seen the 'Version not ready' error after it went up to waiting 128 seconds between checks. Glad it's not just me. The system doesn't seem to be completely read-only (the existing version can still write). On Feb 16, 10:20 pm, gwstuff wrote: > Hi, > > I'm having trouble updating my apps. Anyone else facing this problem? > > At first, I got error #1. Now I'm getting error #2. I only have 5 > versions in this app. > > #1: > > This is the error log reported by appcfg.py. > > Will check again in 64 seconds. > Checking if new version is ready to serve. > Will check again in 128 seconds. > Checking if new version is ready to serve. > 2010-02-16 22:13:43,422 WARNING appcfg.py:1335 Version still not ready > to serve, aborting. > 2010-02-16 22:13:43,523 ERROR appcfg.py:1471 An unexpected error > occurred. Aborting. > Traceback (most recent call last): > File "/Applications/GoogleAppEngineLauncher.app/Contents/Resources/ > GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/ > google/appengine/tools/appcfg.py", line 1460, in DoUpload > self.Commit() > File "/Applications/GoogleAppEngineLauncher.app/Contents/Resources/ > GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/ > google/appengine/tools/appcfg.py", line 1336, in Commit > raise Exception('Version not ready.') > Exception: Version not ready. > Rolling back the update. 
> Traceback (most recent call last): > File "/usr/local/bin/appcfg.py", line 68, in > run_file(__file__, globals()) > File "/usr/local/bin/appcfg.py", line 64, in run_file > execfile(script_path, globals_) > > #2: > > /usr/local/bin/appcfg.py:41: DeprecationWarning: the sha module is > deprecated; use the hashlib module instead > os.path.join(DIR_PATH, 'lib', 'antlr3'), > /Applications/GoogleAppEngineLauncher.app/Contents/Resources/ > GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/ > google/appengine/tools/dev_appserver_login.py:33: DeprecationWarning: > the md5 module is deprecated; use hashlib instead > import md5 > Application: gwtodo; version: iconshootout. > Server: appengine.google.com. > Scanning files on local disk. > Initiating update. > Cloning 16 static files. > Cloning 57 application files. > Deploying new version. > Rolling back the update. > Error 403: --- begin server output --- > > Too Many Versions (403) > The application already has the maximum number of versions. > --- end server output ---
[google-appengine] Re: how much data can I use with app caching, and how long will it be cached?
In Python at least, GAE looks for a function called main() to enable app caching. Simply rename main() to something else. On Feb 13, 6:41 am, Eric Ka Ka Ng wrote: > is it possible to 'disable' the app caching behavior? > > - eric > > On 12 February 2010 17:48, saintthor wrote: > > > see the demo in this page: http://code.google.com/intl/en/appengine/ > > docs/python/runtime.html#App_Caching > > > ### mymodule.py > > counter = 0 > > def increment(): > > global counter > > counter += 1 > > return counter > > > ### myhandler.py > > import mymodule > > > print "Content-Type: text/plain" > > print "" > > print "My number: " + str(mymodule.increment()) > > > do you mean if the site has not been accessed for some minutes, counter > > will be reset to 0? > > > On Feb 12, 4:24 pm, Tim Hoffman wrote: > >> App caching could last as little as a few minutes if your site is not > >> used. > >> In addition if multiple instances are run then only one instance will > >> have the counter with the correct value. > > >> You should store your obj in the datastore and cache it in memcache. > > >> module level caching is really only useful for cacheable things for > >> each instance, > >> for example compiled templates. > > >> T > > >> On Feb 12, 3:34 pm, saintthor wrote: > > >> > ### mymodule.py > >> > counter = LargeObj() > > >> > ### myhandler.py > >> > import mymodule > > >> > print "Content-Type: text/plain" > >> > print "" > >> > print "My number: " + str(mymodule.counter) > > >> > if sizeof counter is greater than 1M, can it work? > > >> > if there is no request for days, will counter still be cached?
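The behaviour under discussion is just module-level state surviving for the life of a process. This runnable stand-in for the docs' mymodule.py shows why the counter climbs on a cached instance and starts over when the instance is recycled (or when app caching is off):

```python
# Stand-in for the mymodule.py example quoted above: module-level state
# persists for the life of the process, just as it persists for the life of
# an App Engine instance when app caching is active.
counter = 0

def increment():
    global counter
    counter += 1
    return counter

# Two "requests" served by the same cached instance see 1 then 2; a fresh
# instance would start counting from 0 again.
first, second = increment(), increment()
```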
[google-appengine] Re: Transaction across entities in different groups
You're right, it's not entirely clear why Model.get_by_id(ids) requires all entities to have the same parent, while Model.get(keys) doesn't have that requirement. Presumably some implementation detail. But whatever the reason, using Model.get(keys) is a good idea. I'll give that a shot when I get a chance. Thanks for the idea! On Jan 29, 4:06 pm, Danny Tuppeny wrote: > I don't know if I was drunk when I sent this message, but I don't > think it makes any sense at all. If you're calling get() on SS, it's > quite clear what the kind is! > > On Jan 28, 7:20 am, Danny Tuppeny wrote: > > > On Jan 24, 12:29 am, dburns wrote: > > > > Making a User instance the parent of a SS (snapshot) instance seems > > > like a natural fit, except then I can't fetch all the favourites via: > > > favs = SS.get_by_id(user.fav_ids). The reason is that all parents > > > have to be the same to use SS.get_by_id (according > > > to http://code.google.com/appengine/docs/python/datastore/modelclass.htm...), > > > but those favourites may have been created by various users (hence the > > > parents would be different). > > > Can you use SS.get(keys) instead? > > > I think the reason that get_by_id requires the parent is because IDs > > are not full keys (and presumably an ID can be duplicated for > > different kinds). If you create Keys for them (you'll need to know > > their Kind), you could call SS.get(keys) and get them all in one go.
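A hedged sketch of the Model.get(keys) idea: a full key carries kind, parent, and id, so a single get() can span entities with different parents. The tuple keys and dict below are illustrative stand-ins for db.Key.from_path and the datastore, since the real calls need the SDK:

```python
# Sketch: fetching entities with heterogeneous parents by full key.
# A key here is a (kind, parent, id) tuple, a stand-in for db.Key.from_path;
# the dict stands in for the datastore.
datastore = {
    ("SS", "alice", 1): "photo-1",
    ("SS", "bob", 2): "photo-2",
}

def get_by_keys(keys):
    # Model.get(keys) works with any mix of parents, unlike get_by_id(ids),
    # which assumes every id shares one parent.
    return [datastore.get(k) for k in keys]

favs = get_by_keys([("SS", "alice", 1), ("SS", "bob", 2)])
```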
[google-appengine] Re: Transaction across entities in different groups
Thanks Robert. The basic techniques I'm aware of for paging with a query object are either: 1) Order by key name (not useful for me since the order looks quasi-random; also you need to represent the starting key for the next page somehow, and it looks ugly in a URL). 2) Sort by an indexed property that's invented for the purpose. An option perhaps, but probably not worth it for me to have the cost of another indexed property (both CPU time and size). Thanks for that blog link. It's exactly on the subject I'm talking about. Having read through it, it looks like the answer to my question is, "possible, but not easy". I'm leaning toward simply accepting that the data could get out of sync on very rare occasions, and to make the system self-repairing or at least resilient to bogus entries. It may seem crude but it's just photos we're talking about here, not a bank account. You asked about query time vs. size of returned elements. Some quick experimentation seems to indicate that the gap between the two methods is close to linear with the number of elements returned, meaning there isn't simply a fixed overhead for the query. I'm talking about API CPU time. The real time seems to fluctuate enough that it's hard to say. Thank you both for your input. On Jan 24, 12:42 am, Robert Kluin wrote: > You can page using the reference query. Since it is a query object, > just be sure that you order in a "stable" way, then you can use one of > the methods suggested in the app engine articles to page. Have you > checked to see if as the size of the list property grows get_by_id()'s > performance gets closer to the query's performance? > > Otherwise you might consider implementing something like Nick's > distributed > transactions: http://blog.notdot.net/2009/9/Distributed-Transactions-on-App-Engine > > Robert > > On Sat, Jan 23, 2010 at 9:14 PM, dburns wrote: > > Thanks for the reply. Using a ReferenceProperty from the SS to the > > User does at first sound ideal, but the problem is that the back-reference that this gives you is actually a Query object. So that > > would essentially put me back to where I was initially (as I described > > in the second-last paragraph). Using the Query seems to be a bit > > slower than fetching all the objects by a list of ids, but the more > > important difference is that the list of ids lends itself to paging (I > > simply chop the list of ids into groups of 20 or whatever). > > > One other optimization I get from using lists of ids (fav_ids and > > owned_ids) is that I can optimize the two get_by_id calls into a > > single call by concatenating the lists into one, then splitting the > > returned list since I know how many are favourites and how many are > > owned. There are a lot of things I like about the current set-up I > > have. The only thing I don't like is the lack of atomic operations. > > If I can't find a way out, I'll try to make the code more resilient to > > "corrupted" data just in case it ever happens. > > > I appreciate the feedback. > > > On Jan 23, 9:51 pm, Wooble wrote: > >> Why not use a ReferenceProperty pointing to a User in the SS model > >> instead of an unindexed StringProperty? The User model can then use > >> the backreference collection to get a list of photos owned by the > >> user. > > >> I don't see a problem with a simple ListProperty of favorites, > >> although making this a list of db.Keys instead of a list of integer > >> IDs is probably preferable, because if you do decide to make photos > >> children of another entity the IDs won't be globally unique while the > >> db.Keys still will be. > > >> On Jan 23, 7:29 pm, dburns wrote: > > >> > I'd appreciate any insight into the best design for this problem. I > >> > have a photo-sharing app. I want people to be able to mark a photo as > >> > a "favourite", just as YouTube does with videos. > > >> > I have two kinds: Snapshots and Users. Given a user, I need to be > >> > able to get a list of their favourites, and a list of the snapshots > >> > they created. I need the inverse too, i.e. given a snapshot, I need > >> > to know who made it. > > >> > The problem I have is ensuring consistency across these two kinds, but > >> > I don't think I can put them into the same entity group. First, here > >> > they are: > > >> > # The snapshot class. > >> > class SS(db.Model): > >> > owner = db.StringProperty(indexed=False) #
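The paging scheme mentioned in this thread ("chop the list of ids into groups of 20 or whatever") is plain list slicing; a minimal sketch:

```python
def pages(ids, page_size=20):
    # Chop a list of snapshot ids into consecutive pages, as described in
    # the thread; each page can then be passed to a single fetch.
    return [ids[i:i + page_size] for i in range(0, len(ids), page_size)]

fav_ids = list(range(45))
fav_pages = pages(fav_ids, 20)  # three pages of 20, 20, and 5 ids
```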
[google-appengine] Re: Transaction across entities in different groups
Thanks for the reply. Using a ReferenceProperty from the SS to the User does at first sound ideal, but the problem is that the back-reference that this gives you is actually a Query object. So that would essentially put me back to where I was initially (as I described in the second-last paragraph). Using the Query seems to be a bit slower than fetching all the objects by a list of ids, but the more important difference is that the list of ids lends itself to paging (I simply chop the list of ids into groups of 20 or whatever). One other optimization I get from using lists of ids (fav_ids and owned_ids) is that I can optimize the two get_by_id calls into a single call by concatenating the lists into one, then splitting the returned list since I know how many are favourites and how many are owned. There are a lot of things I like about the current set-up I have. The only thing I don't like is the lack of atomic operations. If I can't find a way out, I'll try to make the code more resilient to "corrupted" data just in case it ever happens. I appreciate the feedback. On Jan 23, 9:51 pm, Wooble wrote: > Why not use a ReferenceProperty pointing to a User in the SS model > instead of an unindexed StringProperty? The User model can then use > the backreference collection to get a list of photos owned by the > user. > > I don't see a problem with a simple ListProperty of favorites, > although making this a list of db.Keys instead of a list of integer > IDs is probably preferable, because if you do decide to make photos > children of another entity the IDs won't be globally unique while the > db.Keys still will be. > > On Jan 23, 7:29 pm, dburns wrote: > > > I'd appreciate any insight into the best design for this problem. I > > have a photo-sharing app. I want people to be able to mark a photo as > > a "favourite", just as YouTube does with videos. > > > I have two kinds: Snapshots and Users. Given a user, I need to be > > able to get a list of their favourites, and a list of the snapshots > > they created. I need the inverse too, i.e. given a snapshot, I need > > to know who made it. > > > The problem I have is ensuring consistency across these two kinds, but > > I don't think I can put them into the same entity group. First, here > > they are: > > > # The snapshot class. > > class SS(db.Model): > > owner = db.StringProperty(indexed=False) # Who created this (user > > id). > > #Other data about the snapshot here > > > # The user class (the key_name is the user id) > > class User(db.Model): > > owned_ids = db.ListProperty(int, indexed=False) # IDs of owned > > snapshots (i.e. created by this user) > > fav_ids = db.ListProperty(int, indexed=False) # IDs of > > favourite snapshots > > > The issue is that when a user either creates or deletes a snapshot, > > there's a potential for those to get out of sync if an exception > > happens just at the wrong moment (e.g. a snapshot could exist where > > the creating user doesn't have it in the owned_ids list). > > > Making a User instance the parent of a SS (snapshot) instance seems > > like a natural fit, except then I can't fetch all the favourites via: > > favs = SS.get_by_id(user.fav_ids). The reason is that all parents > > have to be the same to use SS.get_by_id (according > > to http://code.google.com/appengine/docs/python/datastore/modelclass.htm...), > > but those favourites may have been created by various users (hence the > > parents would be different). > > > Originally I had no owned_ids in User, and did a query to find that > > user's snapshots (owner in SS was indexed). But that was slow and > > didn't lend itself to paging. So I switched to get_by_id. > > > Documentation note: run_in_transaction > > at http://code.google.com/appengine/docs/python/datastore/functions.html > > doesn't mention the restriction that entities have to be in the same > > group. I discovered it by seeing the exception, then read up in more > > detail elsewhere. I naturally started there so it probably should be > > mentioned.
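The concatenate-then-split optimization described in this post can be sketched as follows; fetch() is a stand-in for SS.get_by_id(ids), assumed (like get_by_id) to return one result per id in order:

```python
# Sketch of the concatenate-then-split optimization: one combined fetch
# instead of two, split afterwards because the list lengths are known.
def fetch(ids):
    # Stand-in for SS.get_by_id(ids); returns one object per id, in order.
    return ["snap-%d" % i for i in ids]

fav_ids = [1, 2, 3]
owned_ids = [10, 11]
combined = fetch(fav_ids + owned_ids)
favs = combined[:len(fav_ids)]
owned = combined[len(fav_ids):]
```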
[google-appengine] Transaction across entities in different groups
I'd appreciate any insight into the best design for this problem. I have a photo-sharing app. I want people to be able to mark a photo as a "favourite", just as YouTube does with videos. I have two kinds: Snapshots and Users. Given a user, I need to be able to get a list of their favourites, and a list of the snapshots they created. I need the inverse too, i.e. given a snapshot, I need to know who made it. The problem I have is ensuring consistency across these two kinds, but I don't think I can put them into the same entity group. First, here they are: # The snapshot class. class SS(db.Model): owner = db.StringProperty(indexed=False) # Who created this (user id). #Other data about the snapshot here # The user class (the key_name is the user id) class User(db.Model): owned_ids = db.ListProperty(int, indexed=False) # IDs of owned snapshots (i.e. created by this user) fav_ids = db.ListProperty(int, indexed=False) # IDs of favourite snapshots The issue is that when a user either creates or deletes a snapshot, there's a potential for those to get out of sync if an exception happens just at the wrong moment (e.g. a snapshot could exist where the creating user doesn't have it in the owned_ids list). Making a User instance the parent of a SS (snapshot) instance seems like a natural fit, except then I can't fetch all the favourites via: favs = SS.get_by_id(user.fav_ids). The reason is that all parents have to be the same to use SS.get_by_id (according to http://code.google.com/appengine/docs/python/datastore/modelclass.html#Model_get_by_id), but those favourites may have been created by various users (hence the parents would be different). Originally I had no owned_ids in User, and did a query to find that user's snapshots (owner in SS was indexed). But that was slow and didn't lend itself to paging. So I switched to get_by_id. 
Documentation note: run_in_transaction at http://code.google.com/appengine/docs/python/datastore/functions.html doesn't mention the restriction that entities have to be in the same group. I discovered it by seeing the exception, then read up in more detail elsewhere. I naturally started there so it probably should be mentioned.
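One way to live with the consistency gap is the resilience approach discussed elsewhere in the thread: treat the id lists as a cache that may contain stale entries, and filter them against the snapshots that actually exist. A minimal in-memory sketch (the dict stands in for SS entities keyed by id):

```python
# Hedged sketch: tolerate owned_ids or fav_ids drifting out of sync by
# dropping ids whose snapshot no longer exists. In the real app the repaired
# list could also be written back to the User entity.
snapshots = {1: "photo-1", 3: "photo-3"}   # stand-in for SS entities by id
owned_ids = [1, 2, 3]                      # id 2 is a stale entry

def repaired_ids(ids, existing):
    return [i for i in ids if i in existing]

clean = repaired_ids(owned_ids, snapshots)
```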
[google-appengine] Re: Automating appcfg.py
Well syntactically, the "< gpass" should always be at the very end (and no, the name doesn't matter). I tried it myself just now and ran into some weird EOF (end of file) error when launching the usual way (i.e. with appcfg.py being the first command). After some experimentation I discovered that this somehow breaks the input redirection to python, but if you launch it by invoking the python executable followed by the script name (appcfg.py) it works OK. This worked for me: python "C:\Program Files\Google\google_appengine\appcfg.py" --passin --email=[email] update [app_folder] < gpass That's on Windows. Of course adjust the path to where you have appcfg.py. On Jan 21, 3:30 pm, pythono wrote: > Hey thanks.. > > Does the password file have to have a particular extension? I saved a > text file containing only my google password, which i called "gpass". > Then I ran the command: > appcfg.py --email=[my email account] --passin < gpass update [my app > folder] > > it resulted in: > Server: appengine.google.com. > Scanning files on local disk. > Scanned 500 files. > Initiating update. > Invalid username or password. > Error 401: --- begin server output --- > Must authenticate first. > --- end server output --- > Password for [my email account] > > Thanks again, > Arjun > > On Jan 9, 1:34 pm, dburns wrote: > > > appcfg.py has this option that should help: > > > --passin Read the login password from stdin. > > > stdin means "standard input", which is normally what you type, but you > > can redirect stdin like this: > > > appcfg.py --passin other_parameters_here < password > > > where password is a file containing nothing but your password. > > > On Jan 6, 5:51 pm, pythono wrote: > > > > Hey there, > > > > I would like to run a shell script that does certain things first and > > > then updates my app engine application. How could I supply my > > > password to appcfg.py without manually entering it in? I'm kind of a > > > novice in shell scripting (I'm using OS X), so the more specific you > > > can be, the better. > > > > Thanks, > > > Arjun
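What --passin does on the appcfg.py side is simply read one line from standard input; the shell's "< gpass" redirection makes that line come from the file. A small Python sketch of the reading side (StringIO stands in for the redirected stdin, and 'example-password' is obviously a placeholder):

```python
import io

def read_passin(stream):
    # --passin reads the password as one line from stdin; the shell's
    # "< gpass" redirection supplies the file contents as that stream.
    return stream.readline().rstrip("\n")

# Stand-in for stdin redirected from a one-line 'gpass' file:
password = read_passin(io.StringIO("example-password\n"))
```

As the reply above notes, the redirection itself belongs at the very end of the command line; the program never sees it, only the resulting stdin.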
[google-appengine] Re: dev_appserver.py issue. When I deploy to app engine the code works fine. Please help
Looks like you are running Python26. I did initially too, but ran into troubles. Things have been smooth since I installed 2.5 instead (which is what they recommend). On Jan 13, 8:46 pm, adamjamesdrew same wrote: > 2010-01-13 20:44:04 Running command: "[u'C:\\Python26\\python.exe', > u'C:\\Program Files\\Google\\google_appengine\\dev_appserver.py', '-- > admin_console_server=', '--port=8081', u'C:\\Documents and Settings\ > \adam\\My Documents\\flockr\\appengine\\testapp\\engineapp']" > C:\Program Files\Google\google_appengine\google\appengine\tools > \appcfg.py:41: DeprecationWarning: the sha module is deprecated; use > the hashlib module instead > import sha > INFO 2010-01-14 01:44:05,280 py_zipimport.py:108] zipimporter('C:\ > \Python26\\lib\\site-packages\\simplejson-2.0.9-py2.6-win32.egg', > 'simplejson\\') > Traceback (most recent call last): > File "C:\Program Files\Google\google_appengine\dev_appserver.py", > line 67, in > run_file(__file__, globals()) > File "C:\Program Files\Google\google_appengine\dev_appserver.py", > line 63, in run_file > execfile(script_path, globals_) > File "C:\Program Files\Google\google_appengine\google\appengine\tools > \dev_appserver_main.py", line 82, in > from google.appengine.tools import appcfg > File "C:\Program Files\Google\google_appengine\google\appengine\tools > \appcfg.py", line 59, in > from google.appengine.tools import bulkloader > File "C:\Program Files\Google\google_appengine\google\appengine\tools > \bulkloader.py", line 112, in > from google.appengine.ext import key_range as key_range_module > File "C:\Program Files\Google\google_appengine\google\appengine\ext > \key_range\__init__.py", line 24, in > import simplejson > File "C:\Python26\lib\site-packages\simplejson-2.0.9-py2.6-win32.egg > \simplejson\__init__.py", line 108, in > File "C:\Program Files\Google\google_appengine\google\appengine\dist > \py_zipimport.py", line 213, in load_module > exec code in mod.__dict__ > File 
"C:\Python26\lib\site-packages\simplejson-2.0.9-py2.6-win32.egg > \simplejson\decoder.py", line 7, in > File "C:\Program Files\Google\google_appengine\google\appengine\dist > \py_zipimport.py", line 213, in load_module > exec code in mod.__dict__ > File "C:\Python26\lib\site-packages\simplejson-2.0.9-py2.6-win32.egg > \simplejson\scanner.py", line 5, in > File "C:\Program Files\Google\google_appengine\google\appengine\dist > \py_zipimport.py", line 213, in load_module > exec code in mod.__dict__ > File "C:\Python26\lib\site-packages\simplejson-2.0.9-py2.6-win32.egg > \simplejson\_speedups.py", line 7, in > File "C:\Python26\lib\site-packages\simplejson-2.0.9-py2.6-win32.egg > \simplejson\_speedups.py", line 4, in __bootstrap__ > File "C:\Python26\lib\site-packages\pkg_resources.py", line 882, in > resource_filename > self, resource_name > File "C:\Python26\lib\site-packages\pkg_resources.py", line 1352, in > get_resource_filename > return self._extract_resource(manager, zip_path) > File "C:\Python26\lib\site-packages\pkg_resources.py", line 1363, in > _extract_resource > zip_stat = self.zipinfo[zip_path] > File "C:\Program Files\Google\google_appengine\google\appengine\dist > \py_zipimport.py", line 268, in __getitem__ > info = _zipfile_cache[self._archive].getinfo(filename) > File "C:\Python26\lib\zipfile.py", line 821, in getinfo > 'There is no item named %r in the archive' % name) > KeyError: "There is no item named 'simplejson_speedups.pyd' in the > archive" > 2010-01-13 20:44:05 (Process exited with code 1) -- You received this message because you are subscribed to the Google Groups "Google App Engine" group. To post to this group, send email to google-appeng...@googlegroups.com. To unsubscribe from this group, send email to google-appengine+unsubscr...@googlegroups.com. For more options, visit this group at http://groups.google.com/group/google-appengine?hl=en.
[google-appengine] Re: Datastore caching
Ignore the part "(to reduce space)" -- admittedly I was copying and pasting from a very similar reply I just made on another thread. :-) The basic idea applies, though. On Jan 13, 1:31 pm, dburns wrote: > Adding a column to a kind will affect any new entities, but won't > affect the existing data. To update the existing data (to reduce > space), you would have to traverse over all the entities and simply > get and put them. > > On Jan 13, 3:02 am, WallyDD wrote: > > > I have added a column to a table in the datastore and all works well > > on my local machine. > > On GAE using the datastore viewer I can see that the datastore has the > > new column and values. I can access/change these values no problem. > > I have one page which pretends that the new values don't exist. I can > > get around it temporarily by changing the fetch value, then all is > > good. > > > After some time(few hours) it reverts to picking up the old datastore > > (so it doesn't pick up the new values) which makes the page incorrect. > > Is there a way to flush this cache? or is there a feature which I am > > unaware of which is causing this. > > > I have had the problem occur for over 24 hours now.
[google-appengine] Re: Datastore caching
Adding a column to a kind will affect any new entities, but won't affect the existing data. To update the existing data (to reduce space), you would have to traverse over all the entities and simply get and put them. On Jan 13, 3:02 am, WallyDD wrote: > I have added a column to a table in the datastore and all works well > on my local machine. > On GAE using the datastore viewer I can see that the datastore has the > new column and values. I can access/change these values no problem. > I have one page which pretends that the new values don't exist. I can > get around it temporarily by changing the fetch value, then all is > good. > > After some time(few hours) it reverts to picking up the old datastore > (so it doesn't pick up the new values) which makes the page incorrect. > Is there a way to flush this cache? or is there a feature which I am > unaware of which is causing this. > > I have had the problem occur for over 24 hours now. -- You received this message because you are subscribed to the Google Groups "Google App Engine" group. To post to this group, send email to google-appeng...@googlegroups.com. To unsubscribe from this group, send email to google-appengine+unsubscr...@googlegroups.com. For more options, visit this group at http://groups.google.com/group/google-appengine?hl=en.
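The traversal described above can be sketched roughly as follows. This is a hypothetical helper, not code from the thread; it assumes the Python `db` API, and query cursors (SDK 1.3.1 or later):

```python
from google.appengine.ext import db

def reput_all(model_class, batch_size=100):
    # Re-save every entity so it is rewritten under the current
    # model definition (e.g. after adding a new property).
    query = model_class.all()
    entities = query.fetch(batch_size)
    while entities:
        db.put(entities)                   # get-and-put, as described above
        query.with_cursor(query.cursor())  # resume where the last batch ended
        entities = query.fetch(batch_size)
```

In practice you would drive this from the remote API shell or a series of task-queue/cron requests so a single request doesn't hit the deadline.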
[google-appengine] Re: How can I predict or calculate overhead in Datastorage? For me it now looks like 900% of real stored data. Or is it simply multiplied by 10?
Every single one of those properties is indexed by default. Do you really need to be able to search or sort by every property? If not, add indexed=False to the parameter list for each property where an index isn't required. That will affect new entries, but won't affect the existing data. To update the existing data (to reduce space), you would have to traverse over all the entities and simply get and put them. On Jan 13, 6:09 am, Петр Воронов wrote: > Hi Google and all. > It's my third post about difference datastore size in Datastore > Statistics and Total Stored Data in Dashboard (so and in Quota or > Billing). > Now I have in Statistic: > Size of all entities 29 MBytes > In Total Stored Data > 29% - 0.29 of 1.00 GBytes > > So I have overhead - 900%. > I don't use any own created index. > I have only two type of Kind > > class MarketStats(db.Model): > typeID = db.IntegerProperty(required=True) > regionID = db.IntegerProperty() > solarSystemID = db.IntegerProperty(required=True) > updated = db.DateTimeProperty() > ordersSell = db.IntegerProperty(default=0) > minSellPrice = db.IntegerProperty(default=0) > maxSellPrice = db.IntegerProperty(default=0) > averageSellPrice = db.IntegerProperty(default=0) > medianSellPrice = db.IntegerProperty(default=0) > volSellRemaining = db.IntegerProperty(default=0) > volSellEntered = db.IntegerProperty(default=0) > newestSellOrder = db.DateProperty(default=None) > oldestSellOrder = db.DateProperty(default=None) > ordersBuy = db.IntegerProperty(default=0) > minBuyPrice = db.IntegerProperty(default=0) > maxBuyPrice = db.IntegerProperty(default=0) > averageBuyPrice = db.IntegerProperty(default=0) > medianBuyPrice = db.IntegerProperty(default=0) > volBuyRemaining = db.IntegerProperty(default=0) > volBuyEntered = db.IntegerProperty(default=0) > newestBuyOrder = db.DateProperty(default=None) > oldestBuyOrder = db.DateProperty(default=None) > > class MarketHistory(MarketStats): > date = db.DateProperty(required=True) > > All items of 
MarketStats work as Parents for some amount of MarketHistory > items. > > All of this has keys name build by > "%06d%09d" % (typeID,solarSystemID) for MarketStats > date.isoformat()+"%06d%09d" % (typeID,solarSystemID) for MarketHistory > > Please help how I can decrease datastore usage in Dashboard? > Which tips and tricks I can use? > Or it's simple multiply data from Statistic by 10 ? > > Best regards, > Chem. -- You received this message because you are subscribed to the Google Groups "Google App Engine" group. To post to this group, send email to google-appeng...@googlegroups.com. To unsubscribe from this group, send email to google-appengine+unsubscr...@googlegroups.com. For more options, visit this group at http://groups.google.com/group/google-appengine?hl=en.
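For illustration, a trimmed-down sketch of the `indexed=False` idea. `MarketStatsLean` is hypothetical (property names borrowed from the model above), and `indexed=False` requires SDK 1.2.2 or later:

```python
from google.appengine.ext import db

class MarketStatsLean(db.Model):
    # Still filtered or sorted on, so leave these indexed:
    typeID = db.IntegerProperty(required=True)
    solarSystemID = db.IntegerProperty(required=True)
    # Display-only values: indexed=False skips the ascending and
    # descending index rows each would otherwise get per entity.
    averageSellPrice = db.IntegerProperty(default=0, indexed=False)
    medianSellPrice = db.IntegerProperty(default=0, indexed=False)
```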
[google-appengine] Re: Automating appcfg.py
appcfg.py has this option that should help: --passin Read the login password from stdin. stdin means "standard input", which is normally what you type, but you can redirect stdin like this: appcfg.py --passin other_parameters_here < password where password is a file containing nothing but your password. On Jan 6, 5:51 pm, pythono wrote: > Hey there, > > I would like to run a shell script that does certain things first and > then updates my app engine application. How could I supply my > password to appcfg.py without manually entering it in? I'm kind of a > novice in shell scripting (i'm using OS X), so the more specific you > can be, the better. > > Thanks, > Arjun -- You received this message because you are subscribed to the Google Groups "Google App Engine" group. To post to this group, send email to google-appeng...@googlegroups.com. To unsubscribe from this group, send email to google-appengine+unsubscr...@googlegroups.com. For more options, visit this group at http://groups.google.com/group/google-appengine?hl=en.
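Since the original question was about shell scripting, here is a sketch of how --passin might fit into a deploy script. The paths, email address, and password-file name are placeholders, not anything from the thread:

```shell
#!/bin/sh
# Run pre-deploy steps, then upload without an interactive password prompt.
# ~/.appcfg_password contains nothing but the password; keep it chmod 600.
set -e
echo "running pre-deploy tasks..."
appcfg.py --email=you@example.com --passin update myapp/ < ~/.appcfg_password
```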
[google-appengine] Re: Using memcache effectively
Satoshi, thanks very much for the detailed description. That certainly gives me something to think about. I assume in your social network, there are a lot of parts of the display that are common to all users. In my case, a given user's page bears no resemblance to another user's page (other than the banner at the top) since it's basically a list of photos. So I'm not sure if the technique would work for me. BTW it sounds like you will get several requests (via AJAX) per page viewed, so I guess that drives up your request count? About memcache not helping much for customized output, I hear what you're saying, but I can at least use it to avoid hitting the datastore multiple times as a given user browses around. That's why I'm thinking I'd like to have the memcached data around for 10 minutes on the assumption that they will hang around that long. But after that 10 minutes, their cached data would just be competing for space with that of other users which is why I'm thinking of limiting its duration. And wondering if putting a 10-minute limit is going to help that situation. Thanks again! On Jan 6, 6:26 pm, Satoshi wrote: > It probably depends on the type of application, but memcache does not > help you much if you have a lot of customized output (which are > different > from one user to another). > > My application (social network application) has a very similar > requirement, > and I am solving this problem by doing following: > > 1. Break the data into pieces, so that each data is identical to all > the users, > 2. Assign a unique URL to each data, and cache them in memcache using > the URL as the key, > 3. Write some JavaScript code on the client side and construct the > custom page > by retrieving those data using AJAX (typically in JSON format, but you > can use HTML-let as well). > > In other word, the server side of code is mostly acting as Model (in > MVC), and > the JavaScript code on the client side is acting as Controller. 
> > I found that this architecture works really well with App Engine and > memcache, allowing me > to keep the cache hit rate very high (>95%) and the average access > time to below 100 cpu_ms. > > By the way, please aware that you still need to put some access > control on the server side (above memcache). > For example, the "profile of each user" can be accessed via "http:// > myapp.appspot.com/profile?uid={user_id}" > (which can be cached in memcache), you want to make it sure that only > friends can access that profile. > > Satoshi > > On Jan 5, 1:21 pm, dburns wrote: > > > My app may display a lot of different output for a given page > > (customized per user amongst other things), and I'm wondering how best > > to use memcache. > > > My current scheme is to estimate that a given user's output might be > > useful for, say 10 minutes, after which time the user has probably > > gone away so there's no point caching it. It's not that the content > > is invalid after 10 minutes (I explicitly clear memcache if content > > becomes invalid). Rather, I'm trying to aid memcache in determining > > what it can get rid of. > > > Is this a good idea, or am I interfering with memcache's algorithm? > > It may all depend on the exact details of the application, but I'm > > wondering if there any general advice on memcache usage. > > > Thanks. > > -- You received this message because you are subscribed to the Google Groups "Google App Engine" group. To post to this group, send email to google-appeng...@googlegroups.com. To unsubscribe from this group, send email to google-appengine+unsubscr...@googlegroups.com. For more options, visit this group at http://groups.google.com/group/google-appengine?hl=en.
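The caching half of the pattern Satoshi describes (one memcache entry per fragment URL) might look something like this. `get_fragment` and `build_profile_json` are hypothetical names, not from the thread:

```python
from google.appengine.api import memcache

def get_fragment(url_key, build_func, ttl=600):
    # Serve a URL-keyed fragment from memcache, rebuilding it on a miss.
    data = memcache.get(url_key)
    if data is None:
        data = build_func()
        memcache.set(url_key, data, ttl)
    return data

# e.g. get_fragment('/profile?uid=123', lambda: build_profile_json(123))
```

Access control would still have to happen before this lookup, as the reply notes.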
[google-appengine] Re: Timezones & Date Formatting
It's an existing issue. Star this one: http://code.google.com/p/googleappengine/issues/detail?id=822&can=5&colspec=ID%20Type%20Status%20Priority%20Stars%20Owner%20Summary%20Log%20Component On Jan 4, 2:16 pm, Evan Klitzke wrote: > I've noticed that app engine internally uses UTC everywhere. This is > OK in general, but I'm wondering how to get the logs (i.e. that you > access from the admin panel) in localtime. Is there a timezone that I > can associate with my Google account, and have the times automatically > converted to localtime? Or is it just displaying the literal time > string that the logging module generates (i.e. in which case I'd > probably have to change the root logging formatter to format the time > in my local timezone)? > > Thanks. > > -- > Evan Klitzke :wq -- You received this message because you are subscribed to the Google Groups "Google App Engine" group. To post to this group, send email to google-appeng...@googlegroups.com. To unsubscribe from this group, send email to google-appengine+unsubscr...@googlegroups.com. For more options, visit this group at http://groups.google.com/group/google-appengine?hl=en.
[google-appengine] Using memcache effectively
My app may display a lot of different output for a given page (customized per user amongst other things), and I'm wondering how best to use memcache. My current scheme is to estimate that a given user's output might be useful for, say 10 minutes, after which time the user has probably gone away so there's no point caching it. It's not that the content is invalid after 10 minutes (I explicitly clear memcache if content becomes invalid). Rather, I'm trying to aid memcache in determining what it can get rid of. Is this a good idea, or am I interfering with memcache's algorithm? It may all depend on the exact details of the application, but I'm wondering if there any general advice on memcache usage. Thanks. -- You received this message because you are subscribed to the Google Groups "Google App Engine" group. To post to this group, send email to google-appeng...@googlegroups.com. To unsubscribe from this group, send email to google-appengine+unsubscr...@googlegroups.com. For more options, visit this group at http://groups.google.com/group/google-appengine?hl=en.
[google-appengine] Re: how to use less time for many urlfetches?
Have you seen this? http://code.google.com/appengine/docs/python/urlfetch/asynchronousrequests.html On Jan 1, 10:25 am, saintthor wrote: > for each urlfetch, i have to wait till it returns. if i run many > urlfetches, how can i save the waiting time? > > i have tried thread, the urlfetches in threads still runs one after > another. -- You received this message because you are subscribed to the Google Groups "Google App Engine" group. To post to this group, send email to google-appeng...@googlegroups.com. To unsubscribe from this group, send email to google-appengine+unsubscr...@googlegroups.com. For more options, visit this group at http://groups.google.com/group/google-appengine?hl=en.
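The pattern from that page, roughly: start every fetch first, then collect the results, so the requests overlap instead of running back to back. The URLs below are placeholders:

```python
from google.appengine.api import urlfetch

urls = ['http://example.com/a', 'http://example.com/b']  # placeholder URLs

rpcs = []
for url in urls:
    rpc = urlfetch.create_rpc(deadline=10)
    urlfetch.make_fetch_call(rpc, url)  # starts the fetch without blocking
    rpcs.append(rpc)

# All fetches are now in flight; wait for each result in turn.
results = [rpc.get_result() for rpc in rpcs]
```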
[google-appengine] Re: How do i flush the memcache on production?
It's not currently available. You can vote up http://code.google.com/p/googleappengine/issues/detail?id=433, but in the meantime you'd have to write your own code to do a memcache.flush_all. On Dec 11, 2:17 pm, Evgeny wrote: > I can do it from my console.. but how can i do it from production? am > i missing something? > > thanks. -- You received this message because you are subscribed to the Google Groups "Google App Engine" group. To post to this group, send email to google-appeng...@googlegroups.com. To unsubscribe from this group, send email to google-appengine+unsubscr...@googlegroups.com. For more options, visit this group at http://groups.google.com/group/google-appengine?hl=en.
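A minimal sketch of such a handler (webapp era; the class name is made up). Map it in app.yaml with login: admin so only you can hit it:

```python
from google.appengine.api import memcache
from google.appengine.ext import webapp

class FlushMemcache(webapp.RequestHandler):
    def get(self):
        # flush_all() returns True on success.
        if memcache.flush_all():
            self.response.out.write('memcache flushed')
        else:
            self.response.out.write('flush failed')
```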
[google-appengine] Re: Enable URL locally for testing purposes
One crude way to tell if you're running locally or not: if self.request.host_url != "http://localhost:8080": # Assume live server On Dec 8, 6:28 pm, Alex Popescu wrote: > On Dec 9, 1:25 am, Alex Popescu > wrote: > > > > > Hi guys, > > > I am wondering if there is a 'recommended' solution for enabling a set > > of URIs when the app is running locally for testing purposes. > > > Until recently I had a setup which was defining additional URI > > mappings in a _localsettings.py module and this was set for exclusion > > in app.yaml. > > But it looks like a change in the SDK is now ignoring all modules > > excluded in app.yaml so I lost this feature. > > > Any ideas? > > > tia, > > > ./alex > > In fact, I had many more overwriting constants/functions defined in > this module to allow me to debug/etc. All this seems to be gone. > > Is there a way to determine if the app is running locally or remotely > through an API call? I guess that would be the only way I could get > this features back. > > ./alex -- You received this message because you are subscribed to the Google Groups "Google App Engine" group. To post to this group, send email to google-appeng...@googlegroups.com. To unsubscribe from this group, send email to google-appengine+unsubscr...@googlegroups.com. For more options, visit this group at http://groups.google.com/group/google-appengine?hl=en.
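A less crude check than comparing host_url is the SERVER_SOFTWARE environment variable, which the dev server sets to a "Development/..." value while production reports "Google App Engine/...":

```python
import os

def is_dev_server():
    # dev_appserver sets e.g. "Development/1.0";
    # production sets e.g. "Google App Engine/1.3.0".
    return os.environ.get('SERVER_SOFTWARE', '').startswith('Development')
```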
[google-appengine] Re: Catch All Doesn't Work
Your first url contain ".py"! You don't want .py in the url, do you? That would only match the url http://something.appspot.com/main2,py, which doesn't sound very likely. The second one should be a catch-all that invokes main.py. I use - url: /.* but I think yours should work. Maybe it's invoking your main.py and the fault lies with main.py. On Dec 7, 5:42 pm, Robert Kluin wrote: > I actually do not have anything directly mapped like your "main2.py," but I > do have "subsystem" handlers like: > - url: /user/.* > secure: always > script: /user/main.py > > Have you tried: > - url: main2.py > script: /main2.py > > My catchall looks like: > - url: .* > script: main.py > secure: optional > > Robert > > On Sun, Dec 6, 2009 at 8:30 PM, ustunozgur wrote: > > I have the following app.yaml file : > > > - url: /main2.py > > script: /main2.py > > - url: /.* > > script: main.py > > > I want to process URL/main2.py with main2.py, and all the other > > requests URL/blabla with main.py. > > > I can't get either working, I get 404 errors in the log in both > > scenarios. > > > 1. Shouldn't the /. pattern match anything except main2.py and process > > everything? > > 2. Doesn't /main2.py match main2.py in the root directory? > > > Am I doing something wrong? I have checked the line endings in > > app.yaml by the way, and they are Unix line endings. > > > Thanks, > > > -- > > > You received this message because you are subscribed to the Google Groups > > "Google App Engine" group. > > To post to this group, send email to google-appeng...@googlegroups.com. > > To unsubscribe from this group, send email to > > google-appengine+unsubscr...@googlegroups.com > > . > > For more options, visit this group at > >http://groups.google.com/group/google-appengine?hl=en. -- You received this message because you are subscribed to the Google Groups "Google App Engine" group. To post to this group, send email to google-appeng...@googlegroups.com. 
[google-appengine] Re: How can I pre-fill the email address when deploying my apps?
On the command line, you can use: appcfg.py --emai...@b.com update [dir] You still have to enter your password. But I find it only prompts me for my password every 24 hours. On Nov 27, 8:44 pm, samwyse wrote: > *sigh* > > Thanks for letting me know it isn't just me. Maybe someone from > Google will fix this. > > On Nov 27, 8:35 am, Brian wrote: > > > I got tired of that and went back to command line. After i run it > > once, then it is "up arrow followed by Enter" to start the upload, > > another "up arrow + Enter" to auto-fill the email field. And then I > > type in the password. It is much faster than the launcher. > > > From your command prompt move to "\program files\google > > \google_appengine" on pre-Win7 or "\program files (x86)\google > > \google_appengine" on Win7. Then issue the command > > > appcfg.py update [your app folder] > > > Brian -- You received this message because you are subscribed to the Google Groups "Google App Engine" group. To post to this group, send email to google-appeng...@googlegroups.com. To unsubscribe from this group, send email to google-appengine+unsubscr...@googlegroups.com. For more options, visit this group at http://groups.google.com/group/google-appengine?hl=en.
[google-appengine] Re: preserve data during dev stage
By default, the development server should preserve data between runs. Are you sure you're not launching dev_appserver with the -c or --clear_datastore flag? On Nov 27, 11:38 pm, james_027 wrote: > hi, > > How do I preserve my data during the development stage? Every time I > start my application all my previously inputted data are lost. > > thanks,
[google-appengine] Watch out for this subtle performance killer
I thought I'd share this, since I'm sure there are others that have fallen into the same trap using this very common pattern (in this sample, Pix derives from db.Model; get_pics is called on every page load):

    def get_pics(self):
        pics = memcache.get("pics")
        if pics is None:
            pics = Pix.gql("LIMIT 100")
            memcache.add("pics", pics, 300)  # Good for 5 minutes
        return pics

See the bug? Here, memcache is actually HURTING performance: its overhead is there, but it saves nothing at all. The query is still executed on every page load when the calling code iterates through the result. http://code.google.com/appengine/docs/python/datastore/queryclass.html#Introduction mentions this by saying "creating a new iterator from the Query object will re-execute the query", but it doesn't highlight this pitfall. The issue is that entities are not fetched on the Pix.gql line. Instead, that call simply returns a Query object. The results are actually fetched when the calling code begins to iterate (in Python-speak, the __iter__() method on the Query is what actually fetches entities). To fix this, change the gql line to:

    pics = list(Pix.gql("LIMIT 100"))

Putting a list() around the Pix.gql forces the query to run at that moment. Then the list of entities is stored in memcache, not the Query object itself. I'm not sure if this applies to the Java API too, but it's worth a heads-up. Comments welcome...
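Put together, the fixed function from this post would read as follows (Pix is the db.Model subclass from the sample, so this fragment assumes it is defined elsewhere):

```python
from google.appengine.api import memcache

def get_pics(self):
    pics = memcache.get("pics")
    if pics is None:
        # list() runs the query NOW, so the fetched entities
        # (not the lazy Query object) are what gets cached.
        pics = list(Pix.gql("LIMIT 100"))
        memcache.add("pics", pics, 300)  # good for 5 minutes
    return pics
```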
[google-appengine] Re: Using handle_exception()
I'm not sure if this is the best way, since I'm pretty new to Python, but here's an example showing a test for CapabilityDisabledError. I think the exception you're looking for is called OverQuotaError.

    def handle_exception(self, exception, debug_mode):
        if isinstance(exception, CapabilityDisabledError):
            # Do something...

On Nov 16, 4:34 pm, herbie <4whi...@o2.co.uk> wrote: > Hi GAE experts > > My handler class overrides handle_exception(exception, debug_mode) > to catch any exceptions not already dealt with. > > I'm sure I'm being dumb, but it's not obvious to me how to test the > 'exception' argument to see if it is a 'CPU usage over quota' > exception? > > Thanks
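For what it's worth, both exception classes live in apiproxy_errors, so a fuller sketch might look like this (MyHandler is a hypothetical handler name):

```python
from google.appengine.ext import webapp
from google.appengine.runtime import apiproxy_errors

class MyHandler(webapp.RequestHandler):
    def handle_exception(self, exception, debug_mode):
        if isinstance(exception, apiproxy_errors.OverQuotaError):
            self.response.out.write('Over quota; please try again later.')
        elif isinstance(exception, apiproxy_errors.CapabilityDisabledError):
            self.response.out.write('Down for maintenance; read-only for now.')
        else:
            # Fall back to webapp's default error handling.
            webapp.RequestHandler.handle_exception(self, exception, debug_mode)
```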
[google-appengine] Do non-stored items in a Model hurt storage efficiency?
Questions related to the db.Model-derived class shown below.

1) Would the "not_stored" member variable hurt storage efficiency in any way? Its purpose is temporary work, and it is of course not persisted to storage. Would its presence slow down storage or retrieval at all?

2) Would the mere presence of the "update_score" member function hurt efficiency?

Having no clue as to how the underlying implementation works, it's hard for me to judge. Thanks in advance!

    class MyModel(db.Model):
        a = db.IntegerProperty()
        b = db.IntegerProperty()
        score = db.IntegerProperty()
        not_stored = 0

        def update_score(self):
            self.score = self.a + self.b
[google-appengine] Re: Syntax Error while including javascript file
This:

    - url: /script/jquery\.js
      script: jquery.js

should be something like:

    - url: /script/jquery\.js
      static_files: jquery.js
      upload: jquery\.js

(or put the file in a directory and serve the whole directory with static_dir). The "script:" line names a Python script to invoke to handle the request; you simply want to hand the file to the browser that's requesting it, which is what the static handlers do. On Nov 12, 3:14 am, Felix wrote: > Hi, > handlers: > - url: /images > static_dir: images > > - url: /script/jquery\.js > script: jquery.js > > - url: / > script: main.py > > This is my app.yaml. I am trying to include the jquery.js file but I > am getting "exceptions.Syntaxerror". I am just including the file. Can > someone please guide me. > I have pasted my code and the error code for reference. > > Code where the error happens > self.response.out.write(' src='script/jquery.js'>') > > Error that I found in the log > : invalid syntax (jquery.js, line 1) > > Thanks!!!
[google-appengine] Re: UnicodeDecodeError when trying to upload unicode CSV file using upload_data
This monologue of mine might help: http://groups.google.com/group/google-appengine/browse_thread/thread/28324f17f9007af5/8c6b48f885f90ae1?hl=en#8c6b48f885f90ae1 My last post summarizes what I found out. On Oct 30, 10:06 pm, Gezim Hoxha wrote: > Hi, > > I'm trying to upload a CSV file to the datastore. I created the model and > bulkloader. However when I run the "appcfg.py upload_data" command I get > this error: > > ...[ERROR ] [Thread-7] WorkerThread: > Traceback (most recent call last): > File > "/Applications/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/tools/adaptive_thread_pool.py", > line 150, in WorkOnItems > status, instruction = item.PerformWork(self.__thread_pool) > File > "/Applications/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/tools/bulkloader.py", > line 671, in PerformWork > transfer_time = self._TransferItem(thread_pool) > File > "/Applications/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/tools/bulkloader.py", > line 826, in _TransferItem > self.content = self.request_manager.EncodeContent(self.rows) > File > "/Applications/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/tools/bulkloader.py", > line 1226, in EncodeContent > entity = loader.create_entity(values, key_name=key, parent=parent) > File > "/Applications/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/tools/bulkloader.py", > line 2443, in create_entity > properties[name] = converter(val) > File > "/Applications/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/api/datastore_types.py", > line 
1055, in __new__ > return super(Text, cls).__new__(cls, arg, encoding) > UnicodeDecodeError: 'ascii' codec can't decode byte 0xc2 in position 511: > ordinal not in range(128) > [ERROR ] [Thread-5] WorkerThread: > Traceback (most recent call last): > File > "/Applications/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/tools/adaptive_thread_pool.py", > line 150, in WorkOnItems > status, instruction = item.PerformWork(self.__thread_pool) > File > "/Applications/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/tools/bulkloader.py", > line 671, in PerformWork > transfer_time = self._TransferItem(thread_pool) > File > "/Applications/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/tools/bulkloader.py", > line 826, in _TransferItem > self.content = self.request_manager.EncodeContent(self.rows) > File > "/Applications/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/tools/bulkloader.py", > line 1226, in EncodeContent > entity = loader.create_entity(values, key_name=key, parent=parent) > File > "/Applications/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/tools/bulkloader.py", > line 2443, in create_entity > properties[name] = converter(val) > File > "/Applications/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/api/datastore_types.py", > line 1055, in __new__ > return super(Text, cls).__new__(cls, arg, encoding) > UnicodeDecodeError: 'ascii' codec can't decode byte 0xc2 in position 51: > ordinal not in range(128) > > Your help would be greatly appreciated. 
> > -Gezim > > Gezim Hoxha (aka Gizmo) http://www.gizmobooks.com -- buy/sell your textbooks
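In case it saves a click: the traceback shows a Text property converter receiving raw bytes that aren't ASCII. One approach is to decode the CSV bytes explicitly in the Loader's converters; the kind, property names, and module path below are hypothetical:

```python
from google.appengine.tools import bulkloader
from myapp.models import Book  # hypothetical model import

class BookLoader(bulkloader.Loader):
    def __init__(self):
        # Each converter receives the raw CSV cell as a byte string;
        # decode it to unicode before it reaches the Text property.
        bulkloader.Loader.__init__(self, 'Book', [
            ('title', lambda v: v.decode('utf-8')),
            ('author', lambda v: v.decode('utf-8')),
        ])

loaders = [BookLoader]
```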
[google-appengine] Understanding concurrent updates
I'm trying to make sure I understand issues surrounding multiple concurrent users. Take the example from http://code.google.com/appengine/docs/python/datastore/transactions.html#Uses_For_Transactions

    def increment_counter(key, amount):
        obj = db.get(key)
        obj.counter += amount
        obj.put()

The text makes it clear that a naive implementation like that is susceptible to a collision between two concurrent users (meaning two separate instances of the Python script running at the same time): the second put() clobbers the result of the first. I'm unclear on how memcache fits in. If those two concurrent instances instead both call cached_obj = memcache.get("my_key") and get a result back from some earlier memcache.add("my_key", my_object), are they susceptible to the same sort of collision if each instance does something like cached_obj.counter += amount? I kind of thought memcache worked seamlessly across instances, but I don't see how in this case. Thanks for any clarification you can provide.
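For reference, the transactional version from that same docs page looks roughly like this (the wrapper function name is made up):

```python
from google.appengine.ext import db

def increment_counter(key, amount):
    obj = db.get(key)
    obj.counter += amount
    obj.put()

def safe_increment(key, amount):
    # Runs the read-modify-write atomically; if another request commits
    # to the same entity group first, the datastore retries it.
    db.run_in_transaction(increment_counter, key, amount)
```

On the memcache side: memcache.get() hands each instance its own deserialized copy, so mutating cached_obj is indeed subject to the same lost-update problem; for plain counters, memcache.incr() and decr() are the atomic operations.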
[google-appengine] Re: Pagination with Ranking
And of course I just finished reading the fine print that says it'll only return a maximum of 1000. Oh well, maybe I stirred some creative juices in someone else. On Oct 2, 8:04 pm, dburns wrote: > Hi Josh, > > Seems like you can use the technique in the linked document, but > replace order("-when") with order("-score"), so I guess the real issue > is how to display the rank (I assume you mean first place, second > place, etc.). If displaying pages of players, I think you can just > use a counter (in Python code, not the data store) that increments > with each displayed name. > > I guess the real challenge is, when talking to a specific player, how > do you tell them that they are ranked 5th. I'm pretty new to GAE, > but I'd probably use a query that selected records that were less than > the current user's score, and then get the count (ignoring ties). > > http://code.google.com/appengine/docs/python/datastore/queryclass.htm... > > On Oct 2, 5:40 pm, Josh wrote: > > > Hi all, > > > I've seen a number of interesting discussions on how to do efficient > > pagination on appengine, but one thing that I haven't seen addressed, > > which I would like, is how to efficiently compute ranking information > > for the items that are paginated. > > > Let me motivate this discussion by considering a leader board. We > > have a model like so: > > > class Player(db.Model): > > name = db.StringProperty() > > score = db.IntegerProperty() > > last_updated = db.DateTimeProperty(auto_now=True) > > > We may have hundreds of thousands of Players and we want to be able to > > rank them by their score. > > > How could we used the previously discussed pagination techniques > > (http://code.google.com/appengine/articles/paging.html) to efficiently > > page through the leader board and not only show the Players in order > > by score, but also assign a rank value to each player. > > > Many thanks for your thoughts and input. 
> > > Cheers, > > Josh
[google-appengine] Re: Pagination with Ranking
Hi Josh,

Seems like you can use the technique in the linked document, but replace order("-when") with order("-score"), so I guess the real issue is how to display the rank (I assume you mean first place, second place, etc.). If displaying pages of players, I think you can just use a counter (in Python code, not the data store) that increments with each displayed name.

I guess the real challenge is, when talking to a specific player, how do you tell them that they are ranked 5th. I'm pretty new to GAE, but I'd probably use a query that selected records with scores less than the current user's score, and then get the count (ignoring ties).

http://code.google.com/appengine/docs/python/datastore/queryclass.html#Query_count

On Oct 2, 5:40 pm, Josh wrote:
> Hi all,
>
> I've seen a number of interesting discussions on how to do efficient pagination on appengine, but one thing that I haven't seen addressed, which I would like to see, is how to efficiently compute ranking information for the items that are paginated.
>
> Let me motivate this discussion by considering a leader board. We have a model like so:
>
> class Player(db.Model):
>     name = db.StringProperty()
>     score = db.IntegerProperty()
>     last_updated = db.DateTimeProperty(auto_now=True)
>
> We may have hundreds of thousands of Players and we want to be able to rank them by their score.
>
> How could we use the previously discussed pagination techniques (http://code.google.com/appengine/articles/paging.html) to efficiently page through the leader board and not only show the Players in order by score, but also assign a rank value to each player?
>
> Many thanks for your thoughts and input.
>
> Cheers,
> Josh
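dburns's counting idea above — rank = 1 + the number of players with a strictly higher score, with ties sharing a rank — can be sketched in plain Python. This is an in-memory stand-in, not datastore code: on App Engine the inner count would be something like a filtered count() query, which is subject to the 1000-result cap mentioned elsewhere in the thread. The player names and scores here are made up for illustration.

```python
# Hypothetical in-memory stand-in for the Player entities.
players = [('alice', 300), ('bob', 250), ('carol', 250), ('dave', 100)]

def rank_of(name):
    """Rank = 1 + number of players with a strictly higher score.

    Ties share a rank, matching the 'ignoring ties' suggestion above.
    """
    my_score = dict(players)[name]
    return 1 + sum(1 for _, score in players if score > my_score)

assert rank_of('alice') == 1
assert rank_of('bob') == 2
assert rank_of('carol') == 2  # tied with bob
assert rank_of('dave') == 4
```

The per-page counter idea works the same way for display: the rank of the first item on page N is just the count of everything that sorts before it, so only one count query is needed per page, not per row.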
[google-appengine] Re: How to NOT index a field
Excellent. Many thanks.
[google-appengine] Re: How to NOT index a field
Thanks Nick! Ah, I was convinced it had to be in index.yaml. OK, that's easy. Do you consider it good practice to do this, or is it naive to think it will help? That's interesting about StringProperty vs TextProperty. I'd assumed there was a different storage mechanism, but it sounds like the two names are essentially there for convenience. DB
[google-appengine] Re: How to NOT index a field
I mean "vaguely recall", not "vaguely require", sorry.
[google-appengine] How to NOT index a field
If I have something like:

class MyModel(db.Model):
    a = db.StringProperty()
    b = db.StringProperty()

and I know that I will never look up an entry by 'b', only by 'a', is it good practice to prevent the automatic indexing of 'b'? Presumably that would save space, and time on updates. Second question is how? I vaguely require seeing some way to prevent indexing on a field, but I've been scouring the indexing docs and can't find it now. Thanks.
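For reference, a sketch of the per-property switch the replies in this thread point at. This is my reading of the thread rather than a quote of Nick's answer: I believe the mechanism is the `indexed` keyword argument that `db` property constructors accept in SDK 1.2.2 and later, and it lives on the model, not in index.yaml.

```python
from google.appengine.ext import db

class MyModel(db.Model):
    a = db.StringProperty()               # indexed automatically; can be filtered/ordered on
    b = db.StringProperty(indexed=False)  # no single-property index: saves index space and
                                          # write time, but 'b' can no longer appear in queries
```

TextProperty, by contrast, is never indexed, which fits the StringProperty vs TextProperty discussion above: the two are stored similarly, but only StringProperty participates in indexes (and carries the 500-character limit).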
[google-appengine] Re: POST data causes error: <type 'exceptions.UnicodeDecodeError'>: 'ascii' codec can't decode byte X in position Y: ordinal not in range(128)
Here's what I found out, in case it helps someone. Turns out I misread the stack trace. The exception happened AFTER calling my handler, not before (trace messages in my handler didn't show up because of buffering, reinforcing my incorrect theory that it hadn't been invoked). The issue was that in this one case I happened to be emitting some valid text that contained the 0xc3 byte. After my handler was invoked, the framework was trying to gather the output using a StringIO, which carries this warning: The StringIO object can accept either Unicode or 8-bit strings, but mixing the two may take some care. If both are used, 8-bit strings that cannot be interpreted as 7-bit ASCII (that use the 8th bit) will cause a UnicodeError to be raised when getvalue() is called. And that's exactly what happened to me. Turns out because I was inadvertently emitting a unicode string at one point, followed later by the 0xc3 byte, it caused the exception. Simply by wrapping the unicode string in str(variable_name), I was able to make the problem go away. I'm not 100% sure I understand why, but emitting u"UTEST" causes the exception while "TEST" does not (when the 0xc3 byte is also present somewhere in the output). It's a bit scary just how easy it is to lay a really obscure trap for yourself like this. I have to be very careful not to emit any unicode strings, it seems. Any guidance/advice appreciated.
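The failure mode described above can be reproduced without App Engine or StringIO: the 0xc3 byte (per the traceback, the lead byte of a two-byte UTF-8 sequence) decodes fine as UTF-8, but fails under the ASCII codec that Python 2 applies implicitly whenever unicode and byte strings are mixed — which is exactly what getvalue()'s join does. A minimal sketch; only the 0xc3 byte comes from the post, the surrounding text is made up for illustration:

```python
raw = b'caf\xc3\xa9'  # "cafe" with e-acute, encoded as UTF-8; 0xc3 is the lead byte

# Decoding with the right codec works:
assert raw.decode('utf-8') == u'caf\xe9'

# Decoding as ASCII -- what Python 2 does implicitly when joining a
# unicode string with this byte string -- fails on the 8-bit byte:
try:
    raw.decode('ascii')
    raised = False
except UnicodeDecodeError:
    raised = True
assert raised
```

This also explains why str(variable_name) made the symptom go away: once no unicode string is in the buffer, the join stays in byte-string land and no implicit ASCII decode is attempted. The more robust fix is the opposite direction — emit only unicode, or encode everything to UTF-8 at one well-defined point.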
[google-appengine] Re: Time zone in admin console
Thanks Nick. I went looking and discovered the issue tracking system, and I believe this is already logged: http://code.google.com/p/googleappengine/issues/detail?id=822 So I guess I'll wait :) Thanks for your reply.
[google-appengine] Re: POST data causes error: <type 'exceptions.UnicodeDecodeError'>: 'ascii' codec can't decode byte X in position Y: ordinal not in range(128)
Here's an update. I found out how to do basic CGI so I was able to examine the POST data directly. Interestingly, I got no errors decoding the POST data as ascii! Does this point to a bug in the run_wsgi_app code? Here's my new mainline. The output I got (minus the very long POST data) is "is_ascii: True No error decoding as ascii".

def is_ascii(s):
    return all(ord(c) in range(128) for c in s)

def main():
    print "Content-Type: text/html"  # HTML is following
    print                            # blank line, end of headers
    s = sys.stdin.read()
    print 'DATA: '
    print s
    print 'is_ascii: ' + str(is_ascii(s))
    try:
        s.decode('ascii')
    except UnicodeDecodeError:
        print "UnicodeDecodeError: not an ascii-encoded unicode string"
    else:
        print "No error decoding as ascii"
    # run_wsgi_app(application)
[google-appengine] POST data causes error: <type 'exceptions.UnicodeDecodeError'>: 'ascii' codec can't decode byte X in position Y: ordinal not in range(128)
Hi, My app normally works fine, but certain POST data is causing it to blow up with a 500 (internal server error). The log contains the message in the stack trace below. Trouble is, my app hasn't even got started yet! It happens right at the run_wsgi_app line. My webapp.RequestHandler has not received control yet, so what can I do? The error mentions ascii, but I don't know why it assumes the POST data was ascii. It's probably UTF-8. I say probably since the POST data isn't directly in my control (it's from Facebook, since this is a Facebook app). Some thoughts: 1) Is there a bug in the run_wsgi_app code? Seems to me that data arriving from outside shouldn't be able to cause an internal server error. 2) Could the POST data be malformed? I don't know much about POST data -- perhaps it's identified as ascii, in error? I'd like to trap it and examine it but I don't know how to do that without the help of webapp.RequestHandler (the app dies before this gets invoked). My code is the standard mainline:

def main():
    run_wsgi_app(application)

if __name__ == "__main__":
    main()

Here's the stack trace from my log. Thanks for any help!
<type 'exceptions.UnicodeDecodeError'>: 'ascii' codec can't decode byte 0xc3 in position 10988: ordinal not in range(128)
Traceback (most recent call last):
  File "/base/data/home/apps/(my directory)/index.py", line 216, in
    main()
  File "/base/data/home/apps/(my directory)/index.py", line 213, in main
    run_wsgi_app(application)
  File "/base/python_lib/versions/1/google/appengine/ext/webapp/util.py", line 76, in run_wsgi_app
    result = application(env, _start_response)
  File "/base/python_lib/versions/1/google/appengine/ext/webapp/__init__.py", line 521, in __call__
    response.wsgi_write(start_response)
  File "/base/python_lib/versions/1/google/appengine/ext/webapp/__init__.py", line 241, in wsgi_write
    body = self.out.getvalue()
  File "/base/python_dist/lib/python2.5/StringIO.py", line 270, in getvalue
    self.buf += ''.join(self.buflist)
[google-appengine] Time zone in admin console
Hi, All the timestamps in the admin console logs seem to be in the PST time zone. Is there any way to adjust the time zone that times are displayed in, so I don't have to do mental arithmetic all the time? Thanks!
[google-appengine] Re: Google I/O 2009 - ..Scalable, Complex Apps on App Engine (audio problems)
The audio on that video is barely audible for me, too.
[google-appengine] Re: Unable to Run the Development Web Server
Looks like the ".py" extension is associated with the text editor rather than python.exe. For me, the installation took care of that, but you can change it manually through Windows. Alternatively, you can explicitly put "python.exe" in front of the command so that it runs the script rather than launching the editor. Incidentally, I highly recommend you uninstall 2.6 and install 2.5. I made the same mistake initially and assumed 2.6 was 'better' than 2.5. There are enough differences between 2.6 and 2.5 to make it troublesome using 2.6.

On Jun 10, 2:02 pm, bryanda wrote:
> I have installed at "C:\Program Files\Google\google_appengine" and "C:\Python26"
>
> I follow the steps at http://code.google.com/appengine/docs/python/tools/devserver.html
>
> I created "helloworld" folder at "C:\Program Files\Google\google_appengine\helloworld" and inside the folder there are two files which are "helloworld.py" and "app.yaml". The codes are created from the steps at http://code.google.com/appengine/docs/python/gettingstarted/helloworl...
>
> My problem is i try in command prompt,
> google_appengine/dev_appserver.py helloworld/
> but it gives me error:
> 'google_appengine' is not recognized as an internal or external command, operable program or batch file.
>
> i try using
> dev_appserver.py helloworld/
> but it opens up dev_appserver.py in text editor.
>
> Please tell me where is my mistake. And may i know when do i use the program Python Interactive shell or PythonWin from "C:\Python26"?
>
> Please help me
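For concreteness, the explicit-interpreter workaround might look like this from a Windows command prompt. The paths are taken from the original post (use C:\Python25 instead if you follow the advice to downgrade); this is a sketch of the idea, not a command verified on that exact setup:

```
REM Invoke python.exe explicitly so the .py file is executed
REM instead of being opened in the associated text editor.
cd "C:\Program Files\Google\google_appengine"
C:\Python26\python.exe dev_appserver.py helloworld/
```

The first error in the post ('google_appengine' is not recognized...) happens because the command was run from a directory where google_appengine/ is not a subdirectory; cd-ing into the SDK directory first, as above, sidesteps that too.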