[google-appengine] www sub domain redirect to another domain
Hi, all visits to the www subdomain are being redirected to another subdomain today. It worked fine before. What's going on, and how can I fix it? Thanks --~--~-~--~~~---~--~~ You received this message because you are subscribed to the Google Groups Google App Engine group. To post to this group, send email to google-appengine@googlegroups.com To unsubscribe from this group, send email to google-appengine+unsubscr...@googlegroups.com For more options, visit this group at http://groups.google.com/group/google-appengine?hl=en -~--~~~~--~~--~--~---
[google-appengine] Re: query to get a resultset matching multiple attributes
Thank you! On Mar 13, 6:23 pm, ryan ryanb+appeng...@google.com wrote: On Mar 13, 12:16 pm, adelevie adele...@gmail.com wrote: routes = db.GqlQuery("SELECT * FROM Route WHERE code = 'BL' OR code = 'WL'") Apparently GQL doesn't like that OR. Try "SELECT * FROM Route WHERE code IN ('BL', 'WL')" instead. http://code.google.com/appengine/docs/python/datastore/gqlreference.html
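GQL has no OR operator; the IN form works because the datastore rewrites it as one equality subquery per value and merges the results. A framework-free sketch of that expansion (the entity dicts and names below are hypothetical illustrations, not the App Engine API):

```python
# Hypothetical stand-in entities for the thread's Route model.
routes = [
    {"code": "BL", "name": "Blue Line"},
    {"code": "WL", "name": "White Line"},
    {"code": "RL", "name": "Red Line"},
]

def equality_subquery(entities, prop, value):
    """One '=' filter: the only kind of filter GQL actually runs for IN."""
    return [e for e in entities if e[prop] == value]

# "code IN ('BL', 'WL')" behaves like the union of two '=' subqueries:
matched = []
for value in ("BL", "WL"):
    matched.extend(equality_subquery(routes, "code", value))

print([e["name"] for e in matched])  # → ['Blue Line', 'White Line']
```

Because each IN value becomes its own subquery, long IN lists multiply query cost; keeping the value list short is part of why this pattern works well here.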
[google-appengine] Re: app-engine-patch MediaGenerator not generating any media!
On Mar 13, 6:03 pm, capoista stewyma...@yahoo.co.uk wrote: 'combined-%(LANGUAGE_CODE)s.js': ( 'myapp/bla.js', 'global/morecode.js', ), I suppose both files don't actually exist? In that case the media generation process will fail. 'combined-%(LANGUAGE_DIR)s.css': ( 'myapp/style.css', If you only integrate myapp/style.css why did you create another style.css in the global media folder? 'global/layout-%(LANGUAGE_DIR)s.css', Does this file exist, at all? Bye, Waldemar Kornewald
[google-appengine] Re: large imports/exports
No I haven't. On Mar 13, 10:46 pm, Amr Ellafi amrl...@gmail.com wrote: Really, great! Have you sliced your data or did the upload at once? On Fri, Mar 13, 2009 at 10:47 PM, Let Delete My Apps davide.rogn...@gmail.com wrote: I've imported 87811 records using bulkload_client.py . . . On Mar 13, 7:53 pm, Ronn Ross ronn.r...@gmail.com wrote: Do you know if there's an easy way to do large imports/exports of data as needed for Google App Engine?
[google-appengine] Re: Customize Over Quota Page
Ok :-) User Feature Request: http://code.google.com/p/googleappengine/issues/detail?id=1145&q=customize&colspec=ID%20Type%20Status%20Priority%20Stars%20Owner%20Summary%20Log%20Component . . . On Mar 13, 9:37 pm, Let Delete My Apps davide.rogn...@gmail.com wrote: Why not? On Mar 12, 9:31 pm, Let Delete My Apps davide.rogn...@gmail.com wrote: Using the JSON/JSONP services, this response would be useful: { "errorMsg": "Over quota", "errorCode": 403 } See also: http://groups.google.com/group/google-appengine/browse_thread/thread/... Is it possible?
[google-appengine] Help with Django's URL setting
In Django, if I request a URL like http://www.abc.com/login it automatically redirects to http://www.abc.com/login/ with a trailing / appended. How can I change this redirection? Thanks!
[google-appengine] Re: save a file in the harddisk?
thx! On Feb 18, 1:20 am, Geoffrey Spear geoffsp...@gmail.com wrote: No. Use the datastore. On Feb 17, 2:01 am, gxtiou gxt...@gmail.com wrote: Can GAE allow us to save a file on the hard disk?
[google-appengine] Re: Help with Django's URL setting
I don't want it to redirect to http://www.abc.com/login/ with the trailing / appended.
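That trailing-slash redirect comes from Django's APPEND_SLASH setting, which is on by default and is honored by CommonMiddleware. A minimal sketch of turning it off (a settings.py fragment; the regex patterns in the note are illustrative):

```python
# settings.py (sketch): disable the automatic /login -> /login/ redirect.
# APPEND_SLASH is acted on by django.middleware.common.CommonMiddleware.
APPEND_SLASH = False
```

With APPEND_SLASH = False, a URL pattern that ends in a slash (e.g. r'^login/$') will no longer catch requests for /login, so the urlconf needs to match the un-slashed form too (e.g. r'^login/?$').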
[google-appengine] Deployments Quota and Billing Settings
I have a need to go over the 250 Deployments per day. Are there any plans to increase the Deployment quota limit through the Billing Settings? For example, could one pay to increase this limit? I'm assuming that every upload is being counted as a Deployment? This number seems to correspond with my uploads. Working in the SDK would not be an option for this application, since it's the Internet transactions between multiple sites that need to be tested, fixed by recoding, then reloaded etc. Regards, Mike Chirico
[google-appengine] logging level
Any guidelines for how much logging we want to leave in our production code? Specifically:
- is the overhead of debug logging significant enough to worry about
- is there an easy way to disable log messages below a certain level
- do I need to do this in every script file, or is there something I can do inside app.yaml
- is this significantly less efficient than removing the log calls (e.g. how fast does it short-circuit)
- do any of these answers change based on how active the app is? e.g. the log call itself may be efficient, but when the amount of data being logged becomes large, I could imagine that the infrastructure code that cleans up old records to free space could be non-trivial and have some performance impact
thanks, Adam
[google-appengine] Please, tell where to read about imports
Hi, I'm new to Python and have problems importing script files. Please help me to overcome it. Here is the structure of my application:

\main.py
\app.yaml
\controller
\controller\index.py (has class IndexPage)
\controller\portfolio.py (has class PortfolioPage)
\html\some_views_for_controllers
\img\some_images
\css\some_css

Here is my app.yaml:

application: supercoolapp
version: 1
runtime: python
api_version: 1
handlers:
- url: /css
  static_dir: css
- url: /img
  static_dir: img
- url: /.*
  script: main.py

Here is the main.py code:

from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app
from controller import *

# MAIN CONFIGURATION #
# Register the URLs with the responsible classes
application = webapp.WSGIApplication([
    ('/', IndexPage),
    ('/portfolio', PortfolioPage)
], debug=True)

# Register the wsgi application to run
def main():
    run_wsgi_app(application)

if __name__ == '__main__':
    main()

And I get this exception while trying to view the first page of my app:

2 from google.appengine.ext.webapp.util import run_wsgi_app
3 from controller import *

type 'exceptions.ImportError': No module named controller
args = ('No module named controller',)
message = 'No module named controller'

So it tells me that there is no module. I've read this: http://docs.python.org/tutorial/modules.html#importing-from-a-package What am I doing wrong? Please tell me. Sorry for the stupid question.
[google-appengine] Re: Please, tell where to read about imports
Yes, it was a silly question. I've added an empty __init__.py file to the controller folder:

\controller\__init__.py
\controller\index.py (has class IndexPage)
\controller\portfolio.py (has class PortfolioPage)

Now I get another error:

type 'exceptions.NameError': name 'IndexPage' is not defined
args = ("name 'IndexPage' is not defined",)
message = "name 'IndexPage' is not defined"

So the package is found, but my classes are not visible. What do I have to do next to make them visible? 2009/3/14 Serega.Sheypak serega.shey...@gmail.com
[google-appengine] Re: Please, tell where to read about imports
Hi Serega, I'm not really sure about this, but __init__.py defines a package. In your case you have a package called controller. This package contains index(.py) and portfolio(.py), which are modules. So, to make your app work, try this in main.py:

from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app

# Import the modules index and portfolio explicitly
# (with an empty __init__.py, 'from controller import *' imports nothing)
from controller import index, portfolio

# MAIN CONFIGURATION #
# Register the URLs with the responsible classes,
# qualifying each class with its module name - this should do the trick
application = webapp.WSGIApplication([
    ('/', index.IndexPage),
    ('/portfolio', portfolio.PortfolioPage)
], debug=True)

It may also be possible to write the import statements for the IndexPage and PortfolioPage classes into the __init__.py file. Hope this helps, acm.
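Both fixes acm mentions can be demonstrated without App Engine at all. A runnable sketch that builds the thread's package layout in a temporary directory (the file contents are hypothetical one-line stand-ins for the real handler classes):

```python
import os
import sys
import tempfile

# Recreate the thread's layout on disk: controller/{__init__,index,portfolio}.py
root = tempfile.mkdtemp()
pkg = os.path.join(root, "controller")
os.makedirs(pkg)

# Option 1: an __init__.py that re-exports the classes, so that
# 'from controller import IndexPage' (or 'import *') brings them into scope.
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write("from controller.index import IndexPage\n"
            "from controller.portfolio import PortfolioPage\n")
with open(os.path.join(pkg, "index.py"), "w") as f:
    f.write("class IndexPage(object):\n    pass\n")
with open(os.path.join(pkg, "portfolio.py"), "w") as f:
    f.write("class PortfolioPage(object):\n    pass\n")

sys.path.insert(0, root)
from controller import IndexPage, PortfolioPage  # works now

print(IndexPage.__name__)  # → IndexPage

# Option 2 (keeping __init__.py empty) would instead need:
#   from controller import index, portfolio
# and then index.IndexPage / portfolio.PortfolioPage in the URL map.
```

The NameError in the thread arises because an empty __init__.py makes the package importable but does not import its submodules; one of the two options above must name them.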
[google-appengine] Re: A question for Jaiku's developers, if they're watching..
Just a heads-up - Jaiku has gone open source :) http://code.google.com/p/jaikuengine/ At a very brief first glance, I see references to xmpp stuff and more..going to try and map out the code and see what goodies might be there, could be stuff of interest beyond pub/sub too. On Mar 13, 1:28 pm, peterk peter.ke...@gmail.com wrote: Unfortunately I do need to query them based on subscriber_id..so I can't pack them into a non-indexed property. Retrieving the updates a particular user has subscribed to is blazingly fast though...that's the gain in the end, I can query and fetch 1000 updates for a user sorted by date in 20-30ms-cpu. Love that :p In my hacky approaches previously where I tried to write once and then 'gather', I had to do lots of in-memory sorting and stuff, and even then the results often wouldn't be totally accurate. I'm going to keep toying with the write end of things though..because in my full app, I may need to write to other entities along with subscribers to do certain things I'm trying to achieve. So I'm going to be looking for every opportunity possible to optimise the cost of an 'update', which in my case may go beyond notifying subscribers. So any thoughts/ideas on further optimisation are more than welcome!! @Paul If you've more subscribers than will fit in one 'group' you'll need multiple groups, correct. So you'll have n writes, where n = number of subscribers/group-size, rounded up to the nearest whole number. Even with the costly index creation for each of these 'group' entities though, it should still work out a fair bit cheaper than writing a separate entity for each subscriber. On Mar 13, 11:47 am, bFlood bflood...@gmail.com wrote: @peterk - if you don't need to query by the subscriber, you could alternatively pack the list of subscribers for a feed into a TextProperty so it is not indexed. I use TextProperty a lot to store large lists of geometry data and they work out pretty well @brett - async! looking forward to it in future GAE builds.
thanks cheers brian On Mar 13, 5:37 am, peterk peter.ke...@gmail.com wrote: I was just toying around with this idea yesterday Brett.. :D I did some profiling, and it would reduce the write cost per subscriber to about 24ms-40ms (depending on the number of subscribers you have..more = lower cost per avg), from 100-150ms. These are rough numbers with entities I was using, I have to do some more accurate profiling.. When I first thought about doing this, I was thinking "I'll reduce write cost by a factor of hundreds!", but as it turns out, the extra index update time for an entity with a large number of list property entries eats into that saving significantly. But it still is a saving. Funnily enough the per subscriber saving increases (to a point) the more subscribers you have. I'm not sure if there's anything one can do to optimise index creation time with large lists.. I'm going to do some more work as well to see if there's an optimum 'batch size' for grouping subscribers together..at first blush, as mentioned above, it seems the larger the better (up to the per entity property/index cap of course). Thanks also for the insight on pubsubhubbub..I eagerly await updates on that front :) Thank you!! On Mar 13, 8:05 am, Paul Kinlan paul.kin...@gmail.com wrote: Just curious - For other pub/sub-style systems where you want to write to the Datastore, the trick is to use list properties to track the subscribers you've published to. So for instance, instead of writing a single entity per subscriber, you write one entity with 1000-2000 subscriber IDs in a list. Then all queries for that list with an equals filter for the subscriber will show the entity. This lets you pack a lot of information into a single entity write, thus minimizing Datastore overhead, cost, etc. Does that make sense? So if you have over the 5000 limit in the subscribers would you write the entity twice? Each with different subscriber IDs?
Paul 2009/3/13 Brett Slatkin brett-appeng...@google.com Heyo, Good finds, peterk! pubsubhubbub uses some of the same techniques that Jaiku uses for doing one-to-many fan-out of status message updates. The migration is underway as we speak (http://www.jaiku.com/blog/2009/03/11/upcoming-service-break/). I believe the code should be available very soon. 2009/3/11 peterk peter.ke...@gmail.com: The app is actually live here: http://pubsubhubbub.appspot.com/ http://pubsubhubbub-subscriber.appspot.com/ (pubsubhubbub-publisher isn't there, but it's trivial to upload your own.) This suggests it's working on appengine as it is now. Been looking through the source, and I'm not entirely clear on how the 'background workers' are actually working..there are two, one for pulling updates
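The batching arithmetic peterk and Paul settle on - n writes, where n = ceil(subscribers / group-size) - can be sketched without any App Engine API. All names below are hypothetical, and the 1000-entry group size follows the list-property figure quoted in the thread:

```python
# Fan-out by list property: instead of one entity write per subscriber,
# pack up to GROUP_SIZE subscriber IDs into each "publish group" entity.
GROUP_SIZE = 1000  # within the per-entity list/index cap discussed above

def publish_groups(subscriber_ids, group_size=GROUP_SIZE):
    """Split subscribers into the batches that would each become one write."""
    return [subscriber_ids[i:i + group_size]
            for i in range(0, len(subscriber_ids), group_size)]

subs = ["user%d" % n for n in range(2500)]
groups = publish_groups(subs)
print(len(groups))      # → 3 writes instead of 2500
print(len(groups[-1]))  # → 500 (the final, partial group)

# Reading back mimics the datastore's list-property equality filter:
# a query 'WHERE subscribers = some_user' matches any group containing it.
matching = [g for g in groups if "user1500" in g]
print(len(matching))    # → 1
```

This is only the write-count side of the trade-off; as peterk notes, the index update for a 1000-entry list property is itself costly, so the real saving per subscriber is smaller than the raw ratio suggests.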
[google-appengine] Release Status of Google App Engine
Is GAE still a preview release? I guess that I would have thought that when billing was enabled it would have lost its preview release status. Thanks, Doug
[google-appengine] Re: large imports/exports
What about exporting? Is it just as easy to export your data out of App Engine? On Sat, Mar 14, 2009 at 6:25 AM, Let Delete My Apps davide.rogn...@gmail.com wrote: No I haven't. On Mar 13, 10:46 pm, Amr Ellafi amrl...@gmail.com wrote: Really, great! Have you sliced your data or did the upload at once? On Fri, Mar 13, 2009 at 10:47 PM, Let Delete My Apps davide.rogn...@gmail.com wrote: I've imported 87811 records using bulkload_client.py . . . On Mar 13, 7:53 pm, Ronn Ross ronn.r...@gmail.com wrote: Do you know if there's an easy way to do large imports/exports of data as needed for Google App Engine?
[google-appengine] Re: Release Status of Google App Engine
Google services and products are often held in seemingly perpetual beta :p I wouldn't worry about the designation too much. Billing was a major step in making it a service usable for large, heavy-use, popular apps, and perhaps makes it less easy for Google to hide behind the 'preview' moniker when it comes to certain basic expectations about how the service will perform etc. But GAE still doesn't have quality-of-service guarantees or an SLA. I wouldn't expect them to drop the preview label until something like that is added, at the very earliest. I think Google knows something has to be of 'merchantable quality' before they can start charging for it..so I think the introduction of billing was an expression of their confidence in their ability to maintain a reasonable, sellable level of service quality. From our point of view, if we're paying customers, it gives us more leeway to complain when things go wrong, and Google knows that. On Mar 14, 4:25 pm, Doug doug...@gmail.com wrote: Is GAE still a preview release? I guess that I would have thought that when billing was enabled it would have lost its preview release status. Thanks, Doug
[google-appengine] Re: logging level
Hi Adam, I have not tried this in GAE, but you should be able to control the logging level (and a whole lot more) in all of your scripts via a single configuration file. You will have to add a line to each of your scripts that points to the file, but after that the logging level is controlled by that configuration file. (One change to the configuration file and a redeploy would change the logging in all of your scripts.) Take a look at this site for information on how to do this: http://docs.python.org/library/logging.html On the performance/overhead question - I usually try to limit any logging statements that would log large amounts of information to except blocks in my production code. My reasoning is that if the code has already hit an exception, the user experience is already diminished, so taking a bit of extra time to make sure I have the information needed to fix the problem is beneficial to their future experience. Hope this helps. Doug On Mar 14, 8:39 am, Adam adamsplug...@gmail.com wrote:
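Concretely, the stdlib logging module answers most of Adam's bullets: level filtering is a cheap integer comparison that short-circuits before the message is formatted, and a single call on the root logger raises the threshold for every module that logs through it. A small runnable sketch (the list-collecting handler is a hypothetical test double, not a production handler):

```python
import logging

# One line, e.g. in a shared module, silences everything below WARNING
# for every script that logs through the root logger.
logging.getLogger().setLevel(logging.WARNING)

records = []

class ListHandler(logging.Handler):
    """Toy handler that collects emitted messages so we can inspect them."""
    def emit(self, record):
        records.append(record.getMessage())

logging.getLogger().addHandler(ListHandler())

# The disabled call short-circuits on a level check; with '%s'-style lazy
# formatting the argument isn't even interpolated when the call is skipped.
logging.debug("expensive detail: %s", "not formatted or stored")
logging.warning("kept: %s", "important")

print(records)  # → ['kept: important']
```

Passing arguments lazily (logging.debug("x=%s", x) rather than logging.debug("x=%s" % x)) is what keeps disabled calls near-free; the residual cost is one method call and one comparison per call site.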
[google-appengine] Re: urgent help needed: can't update to GAE via HTTP proxy
Yes, I did as http://code.google.com/appengine/docs/python/tools/uploadinganapp.html#Using_an_HTTP_Proxy describes, but it doesn't work; I can't update successfully that way. On Mar 13, 9:50 pm, Sharp-Developer.Net alexander.trakhime...@gmail.com wrote: Try to search on this group or in Google.com - it was discussed many times here. -- Alex On Mar 13, 6:57 am, Coonay fla...@gmail.com wrote:

cmd> set HTTP_PROXY=http://cache.mycompany.com:3128
cmd> python appcfg.py update myapp

URLError: urlopen error [Errno 10061] No connection could be made because the target machine actively refused it
Traceback (most recent call last):
  File "C:\Program Files\Google\google_appengine\appcfg.py", line 60, in <module>
    run_file(__file__, globals())
  File "C:\Program Files\Google\google_appengine\appcfg.py", line 57, in run_file
    execfile(script_path, globals_)
  File "C:\Program Files\Google\google_appengine\google\appengine\tools\appcfg.py", line 1976, in <module>
    main(sys.argv)
  File "C:\Program Files\Google\google_appengine\google\appengine\tools\appcfg.py", line 1967, in main
    result = AppCfgApp(argv).Run()
  File "C:\Program Files\Google\google_appengine\google\appengine\tools\appcfg.py", line 1418, in Run
    self.action(self)
  File "C:\Program Files\Google\google_appengine\google\appengine\tools\appcfg.py", line 1879, in __call__
    return method()
  File "C:\Program Files\Google\google_appengine\google\appengine\tools\appcfg.py", line 1669, in Update
    lambda path: open(os.path.join(basepath, path), "rb"))
  File "C:\Program Files\Google\google_appengine\google\appengine\tools\appcfg.py", line 1213, in DoUpload
    missing_files = self.Begin()
  File "C:\Program Files\Google\google_appengine\google\appengine\tools\appcfg.py", line 1009, in Begin
    version=self.version, payload=self.config.ToYAML())
  File "C:\Program Files\Google\google_appengine\google\appengine\tools\appengine_rpc.py", line 312, in Send
    self._Authenticate()
  File "C:\Program Files\Google\google_appengine\google\appengine\tools\appengine_rpc.py", line 344, in _Authenticate
    super(HttpRpcServer, self)._Authenticate()
  File "C:\Program Files\Google\google_appengine\google\appengine\tools\appengine_rpc.py", line 233, in _Authenticate
    auth_token = self._GetAuthToken(credentials[0], credentials[1])
  File "C:\Program Files\Google\google_appengine\google\appengine\tools\appengine_rpc.py", line 177, in _GetAuthToken
    response = self.opener.open(req)
  File "C:\PROGRA~1\Python26\lib\urllib2.py", line 383, in open
    response = self._open(req, data)
  File "C:\PROGRA~1\Python26\lib\urllib2.py", line 401, in _open
    '_open', req)
  File "C:\PROGRA~1\Python26\lib\urllib2.py", line 361, in _call_chain
    result = func(*args)
  File "C:\PROGRA~1\Python26\lib\urllib2.py", line 1138, in https_open
    return self.do_open(httplib.HTTPSConnection, req)
  File "C:\PROGRA~1\Python26\lib\urllib2.py", line 1105, in do_open
    raise URLError(err)
urllib2.URLError: urlopen error [Errno 10061] No connection could be made because the target machine actively refused it

PS: the version of GAE is 1.1.9, Python is 2.6
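One detail worth checking, suggested by the traceback itself: the failure is in https_open, and urllib2 selects its proxy per URL scheme, so HTTP_PROXY alone does not cover the HTTPS authentication request. A sketch of setting both variables (the proxy address is the hypothetical one from the thread; on Windows cmd use `set NAME=value` in the same session before running `python appcfg.py update myapp`):

```shell
# POSIX-shell form of the same setup; urllib2 reads the per-scheme
# proxy variables, so the HTTPS one must be set as well as the HTTP one.
export HTTP_PROXY=http://cache.mycompany.com:3128
export HTTPS_PROXY=http://cache.mycompany.com:3128
```

If the proxy itself refuses CONNECT tunnelling for HTTPS, the same "actively refused" error can appear even with both variables set, which would point the investigation at the proxy configuration rather than appcfg.py.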
[google-appengine] Re: A question for Jaiku's developers, if they're watching..
By the way, we do something very similar at Sponty (http://www.thesponty.com/boston). We're adding a new feature whereby when one of your friends posts an event, we notify his or her friends via email, sms or twitter. It would take too long to process those communiqués right away, so we just write a PendingNotification to the datastore. All clients make a periodic ajax call to the server. We call that presence. Each presence call processes a set of PendingNotifications in a transaction to ensure we do not email people more than once. Since the UI is not waiting on the result of the presence call, it can take as long as it needs. -Mahmoud On Mar 13, 2:23 am, Brett Slatkin brett-appeng...@google.com wrote: Heyo, Good finds, peterk! pubsubhubbub uses some of the same techniques that Jaiku uses for doing one-to-many fan-out of status message updates. The migration is underway as we speak (http://www.jaiku.com/blog/2009/03/11/upcoming-service-break/). I believe the code should be available very soon. 2009/3/11 peterk peter.ke...@gmail.com: The app is actually live here: http://pubsubhubbub.appspot.com/ http://pubsubhubbub-subscriber.appspot.com/ (pubsubhubbub-publisher isn't there, but it's trivial to upload your own.) This suggests it's working on appengine as it is now. Been looking through the source, and I'm not entirely clear on how the 'background workers' are actually working..there are two, one for pulling updates to feeds from publishers, and one for propagating updates to subscribers in batches. But like I say, I can't see how they're actually started and running constantly. There is a video here of a live demonstration: http://www.veodia.com/player.php?vid=fCNU1qQ1oSs The background workers seem to be behaving as desired there, but I'm not sure if they were just constantly polling some urls to keep the workers live for the purposes of that demo, or if they're actually running somehow constantly on their own..
I can't actually get the live app at the urls above to work, but not sure if it's because background workers aren't really working, or because i'm feeding it incorrect urls/configuration etc. Ah sorry yeah I still have the old version of the source running on pubsubhubbub.appspot.com; I need to update that with a more recent build. Sorry for the trouble! It's still not quite ready for widespread use, but it should be soon. The way pubsubhubbub does fan-out, there's no need to write an entity for each subscriber of a feed. Instead, each time it consumes a task from the work queue it will update the current iterator position in the query result of subscribers for a URL. Subsequent work requests will offset into the subscribers starting at the iterator position. This works well in this case because it's using urlfetch to actually notify subscribers, instead of writing to the Datastore. For other pub/sub-style systems where you want to write to the Datastore, the trick is to use list properties to track the subscribers you've published to. So for instance, instead of writing a single entity per subscriber, you write one entity with 1000-2000 subscriber IDs in a list. Then all queries for that list with an equals filter for the subscriber will show the entity. This lets you pack a lot of information into a single entity write, thus minimizing Datastore overhead, cost, etc. Does that make sense? @bFlood: Indeed, the async_apiproxy.py code is interesting. Not much to say about that at this time, besides the fact that it works. =) -Brett
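Mahmoud's pattern - queue a PendingNotification, then let periodic "presence" calls claim a batch atomically so nobody is emailed twice - can be sketched framework-free. The in-memory lists and the lock are hypothetical stand-ins for the real datastore entities and for db.run_in_transaction:

```python
import threading

# Hypothetical in-memory stand-ins for the thread's datastore entities.
pending = ["email alice", "email bob", "email carol"]  # PendingNotification rows
sent = []                                              # side effects performed
txn = threading.Lock()  # stands in for a datastore transaction

def presence_call(batch_size=2):
    """One periodic ajax 'presence' hit: claim a batch atomically, then act.

    Claiming inside the transaction is what guarantees that no notification
    is processed by two concurrent presence calls (i.e. no double emails).
    """
    with txn:
        batch = pending[:batch_size]
        del pending[:batch_size]
    for job in batch:     # slow delivery work happens outside the transaction;
        sent.append(job)  # the UI isn't waiting on this call anyway

presence_call()
presence_call()
print(sent)     # → ['email alice', 'email bob', 'email carol']
print(pending)  # → []
```

The design choice worth noting is that only the claim is transactional; the slow delivery runs afterwards, which keeps the transaction short at the cost of possibly losing a claimed-but-undelivered batch if the call dies mid-way.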
[google-appengine] xml2jsonp - Cross Site Service
This free app is useful to convert XML to JSON-P http://pyoohtml.appspot.com/xml2jsonp A similar project: http://www.thomasfrank.se/xml_to_json.html Comments: http://appgallery.appspot.com/about_app?app_id=agphcHBnYWxsZXJ5chQLEgxBcHBsaWNhdGlvbnMYpdgFDA Bye :-)
[google-appengine] Re: A question for Jaiku's developers, if they're watching..
I am really stoked by the fact that it is now open. I love seeing how other people develop software. Paul. 2009/3/14 peterk peter.ke...@gmail.com Just a heads-up - Jaiku has gone open source :) http://code.google.com/p/jaikuengine/ At a very brief first glance, I see references to XMPP stuff and more... going to try and map out the code and see what goodies might be there; could be stuff of interest beyond pub/sub too. On Mar 13, 1:28 pm, peterk peter.ke...@gmail.com wrote: Unfortunately I do need to query them based on subscriber_id... so I can't pack them into a non-indexed property. Retrieving the updates a particular user has subscribed to is blazingly fast, though... that's the gain in the end: I can query and fetch 1000 updates for a user sorted by date in 20-30 ms-cpu. Love that :p In my hacky approaches previously, where I tried to write once and then 'gather', I had to do lots of in-memory sorting, and even then the results often wouldn't be totally accurate. I'm going to keep toying with the write end of things, though... because in my full app, I may need to write to other entities along with subscribers to do certain things I'm trying to achieve. So I'm going to be looking for every opportunity possible to optimise the cost of an 'update', which in my case may go beyond notifying subscribers. So any thoughts/ideas on further optimisation are more than welcome!! @Paul If you've more subscribers than will fit in one 'group', you'll need multiple groups, correct. So you'll have n writes, where n = number of subscribers / group size, rounded up to the nearest whole number. Even with the costly index creation for each of these 'group' entities, though, it should still work out a fair bit cheaper than writing a separate entity for each subscriber. On Mar 13, 11:47 am, bFlood bflood...@gmail.com wrote: @peterk - if you don't need to query by the subscriber, you could alternatively pack the list of subscribers for a feed into a TextProperty so it is not indexed.
I use TextProperty a lot to store large lists of geometry data and they work out pretty well. @brett - async! Looking forward to it in future GAE builds. Thanks. Cheers, Brian On Mar 13, 5:37 am, peterk peter.ke...@gmail.com wrote: I was just toying around with this idea yesterday, Brett... :D I did some profiling, and it would reduce the write cost per subscriber to about 24 ms-40 ms (depending on the number of subscribers you have... more = lower cost per subscriber on average), from 100-150 ms. These are rough numbers with the entities I was using; I have to do some more accurate profiling. When I first thought about doing this, I was thinking 'I'll reduce write cost by a factor of hundreds!', but as it turns out, the extra index update time for an entity with a large number of list property entries eats into that saving significantly. But it still is a saving. Funnily enough, the per-subscriber saving increases (to a point) the more subscribers you have. I'm not sure if there's anything one can do to optimise index creation time with large lists. I'm going to do some more work as well to see if there's an optimum 'batch size' for grouping subscribers together... at first blush, as mentioned above, it seems the larger the better (up to the per-entity property/index cap, of course). Thanks also for the insight on pubsubhubbub... I eagerly await updates on that front :) Thank you!! On Mar 13, 8:05 am, Paul Kinlan paul.kin...@gmail.com wrote: Just curious: For other pub/sub-style systems where you want to write to the Datastore, the trick is to use list properties to track the subscribers you've published to. So for instance, instead of writing a single entity per subscriber, you write one entity with 1000-2000 subscriber IDs in a list. Then all queries for that list with an equality filter for the subscriber will show the entity. This lets you pack a lot of information into a single entity write, thus minimizing Datastore overhead, cost, etc. Does that make sense?
So if you have over the 5000 limit in subscribers, would you write the entity twice, each with different subscriber IDs? Paul 2009/3/13 Brett Slatkin brett-appeng...@google.com Heyo, Good finds, peterk! pubsubhubbub uses some of the same techniques that Jaiku uses for doing one-to-many fan-out of status message updates. The migration is underway as we speak (http://www.jaiku.com/blog/2009/03/11/upcoming-service-break/). I believe the code should be available very soon. 2009/3/11 peterk peter.ke...@gmail.com: The app is actually live here: http://pubsubhubbub.appspot.com/ http://pubsubhubbub-subscriber.appspot.com/ (pubsubhubbub-publisher isn't there, but it's trivial to upload)
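peterk's formula above (n writes, where n = number of subscribers / group size, rounded up) is easy to make concrete. The 5000 default below comes from Paul's question about the per-entity limit and is only an assumption here:

```python
import math

def groups_needed(num_subscribers, group_size=5000):
    # n writes, where n = subscribers / group size, rounded up to the
    # nearest whole number, as described in the thread above.
    return int(math.ceil(num_subscribers / float(group_size)))
```

So 5001 subscribers with groups of 5000 means two entity writes per update, still far cheaper than 5001 per-subscriber writes.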
[google-appengine] Re: Error on Live but works fine in SDK
It seems that I need to add an index for this to work! I have beefed up my exception handling to help me track this issue. David On Mar 12, 5:58 pm, Marzia Niccolai ma...@google.com wrote: Hi, What is the error that it throws? -Marzia On Thu, Mar 12, 2009 at 3:47 AM, David Burns tynecas...@gmail.com wrote: I am having a small issue with my app when it's running up on the live environment. When I try to store an object into the datastore, it throws an error. Model:

class Story(db.Model):
    '''This model will hold all the information for the items that need to be worked on.'''
    StoryName = db.StringProperty(required=True)
    StoryID = db.IntegerProperty()
    IWant = db.StringProperty()
    SoThat = db.StringProperty()
    WithScenario = db.TextProperty()  # Scenarios will hold major test cases that need to be passed
    Status = db.StringProperty(required=True, choices=set(['New', 'InDev', 'InTest', 'Complete', 'Rejected']))  # This will hold what state the item is in.
    created = db.DateTimeProperty()
    Owner = db.UserProperty(required=True)
    Priority = db.IntegerProperty()
    OrgRef = db.ReferenceProperty(Organisation)
    ProjectID = db.ReferenceProperty(Project)

Code throwing the error:

def _getNextStoryID(self, orgKey):
    '''This method collects the next available StoryID and returns it to its caller.
    @Returns the next available number for the organisation'''
    logging.debug('storystore._getNextStoryID: has been called')
    try:
        story_query = Story.all()
        story_query.filter('OrgRef = ', orgKey)
        story_query.order('-StoryID')
        story = story_query.get()
        if story is None:
            storyID = 1
        else:
            storyID = story.StoryID + 1
        return storyID
    except:
        raise Exception('Story ID could not be collected')

The line that is throwing the error seems to be story_query = Story.all(). This all seems to be fine on my dev environment, but when I deploy it I get errors thrown. I appreciate any help that people can offer! David
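For reference, David's query (an equality filter on OrgRef combined with a descending sort on StoryID) is the kind that the dev server handles without configuration but that needs a composite index in production. A hypothetical index.yaml entry matching that query might look like:

```yaml
# Hypothetical index.yaml entry for: filter('OrgRef =', ...).order('-StoryID')
indexes:
- kind: Story
  properties:
  - name: OrgRef
  - name: StoryID
    direction: desc
```

The dev server can also generate such entries automatically into index.yaml when it sees the query run locally.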
[google-appengine] Re: XML to json example?
See: http://appgallery.appspot.com/about_app?app_id=agphcHBnYWxsZXJ5chQLEgxBcHBsaWNhdGlvbnMYpdgFDA On Jan 30, 7:12 pm, Geuis geuis.te...@gmail.com wrote: Does anyone have a sample XML to JSON converter they could share? I have tried building one but I have failed.
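Since the question never got a code answer in this thread, here is a minimal sketch using only Python's standard library. The conventions chosen (an '@' prefix for attributes, a '#text' key for mixed content) are one common mapping, not the only one:

```python
# Minimal XML-to-JSON converter sketch using only the standard library.
import json
import xml.etree.ElementTree as ET

def element_to_dict(elem):
    node = {}
    # Attributes become '@'-prefixed keys.
    node.update(('@' + k, v) for k, v in elem.attrib.items())
    for child in elem:
        value = element_to_dict(child)
        if child.tag in node:
            # Repeated child tags collapse into a list.
            existing = node[child.tag]
            if not isinstance(existing, list):
                node[child.tag] = [existing]
            node[child.tag].append(value)
        else:
            node[child.tag] = value
    text = (elem.text or '').strip()
    if text and not node:
        return text          # leaf element: just its text
    if text:
        node['#text'] = text  # mixed content keeps text under '#text'
    return node

def xml_to_json(xml_string):
    root = ET.fromstring(xml_string)
    return json.dumps({root.tag: element_to_dict(root)})
```

Wrapping the output in a callback name would turn this into the JSON-P the xml2jsonp app provides.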
[google-appengine] Re: logging level
You can just do something like this in your main application file:

import logging
from google.appengine.ext.webapp.util import run_wsgi_app

def main():
    logging.getLogger().setLevel(logging.DEBUG)
    run_wsgi_app(application)  # 'application' is your WSGIApplication instance
[google-appengine] Re: help me about Django's url setting!!!!!!!
See: http://docs.djangoproject.com/en/dev/ref/middleware/#module-django.middleware.common http://www.djangosnippets.org/snippets/601/ In your settings.py: APPEND_SLASH = False On Mar 14, 5:01 am, gxtiou gxt...@gmail.com wrote: I don't want it to redirect to http://www.abc.com/login/ with a '/' added at the end.
[google-appengine] Re: logging level
I was curious about logging load as well, so I ran a few small tests. It looks like it costs around 0.25 milliseconds per log call on the production server, which is not free but is very cheap. But I also noticed that the development server takes a dive if you log more than around 500 times in response to a single request. I haven't investigated why that might be, but it did explain some puzzling hangs I had when debugging over-logged code. The production server has no such issues and happily logged as many as 10,000 times in a single request. The cost rose to around 0.4 milliseconds per log in this test, though. I was also duly chastised in the log entry for using so much CPU in a single request. I'm still puzzling over the CPU usage claims. This 10,000-log-entries test seems to complete in around 0.786 seconds wall-clock time (using Python's time module to record the start and end time of the request, then taking the difference) but is charged 3948 ms in CPU, which seems odd.
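The measurement described above is easy to reproduce in miniature. This only demonstrates the method (wall-clock time divided by call count) and says nothing about App Engine's own figures:

```python
import logging
import time

def per_call_logging_cost(n=1000):
    # Time n log calls and return the mean wall-clock cost per call, in ms.
    logger = logging.getLogger('cost-test')
    logger.setLevel(logging.CRITICAL)  # suppress output; call overhead remains
    start = time.time()
    for i in range(n):
        logger.debug('log entry %d', i)
    return (time.time() - start) * 1000.0 / n
```

Note this measures wall-clock time, not billed CPU; as the post observes, the two can differ substantially on App Engine.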
[google-appengine] Re: Twitter - rate limit exceeded
Interesting, I have a Twitter app (http://twendly.appspot.com) but I don't seem to be having this issue at the moment. However, while I read information every 5 minutes from the Google search API (which is rate limited differently), I only send a few messages (no more than 5 or 6 max, and usually only 4) as the hour ticks over. Although occasionally this drops a message, it's generally pretty solid. Perhaps because of when I'm sending them, I get in at the start of the allocation. As far as scalability goes, I would say GAE is really suited for its read scalability, so unless your Twitter bot's writes are going to be massive, scalability shouldn't be an issue if you move these writes over to a separate host. I guess a (nasty but possible) pattern would be to have the Twitter interaction come from your own host, which could act as a proxy, then use App Engine for all the processing and reporting on the data. At least in my application this would be a potential work-around if this becomes an issue. Cheers, Tim On Sat, Mar 14, 2009 at 3:57 PM, Richard Bremner richyr...@gmail.com wrote: Hmmm, yes, this is a difficult one. Neither Twitter nor Google are being unreasonable, and each GAE developer is probably performing a sane number of Twitter API requests, but combined we are ruining it for everyone. Ohhh, the solution? I can't think of a good solution Twitter could implement which wouldn't make it easy to circumvent their limit unreasonably. I do happen to have a hosted Linux server I can put a proxy script on, so I guess I'm lucky there, but I am using GAE for its scalability, which my server certainly isn't. I don't need to go into all the reasons GAE is more scalable than my own server :-) If anyone thinks of anything, I'd love to know. Rich 2009/3/14 lock lachlan.hu...@gmail.com Hmmm. My next App Engine project _was_ going to be an app that relied on Twitter. This doesn't sound good. As per your situation, the app wouldn't hammer Twitter: one request to the search API every 5-10 minutes or so. Given it's not exactly an App Engine problem, did you try contacting Twitter to see if they could build more 'smarts' into their rate limiting? Would be really interested to see if you end up resolving this issue; thanks for the heads up. Sorry I can't help. Cheers, lock On Mar 12, 10:43 pm, richyrich richyr...@gmail.com wrote: Hi there, I have been writing a simple little app that uses the Twitter API. It works perfectly on my local development server, but it fails when I upload it because I get this error from Twitter: error=Rate limit exceeded. Clients may not make more than 100 requests per hour. ...even though my app only makes 1 request. What is happening is that other people's apps must be using the Twitter API from the same IP address. Does anyone know a good way around this other than hosting my app somewhere else?
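Tim's proxy pattern could start as small as a URL rewrite on the App Engine side: the app never calls twitter.com directly, but routes each API URL through your own host, which forwards the request from its unshared IP. Here proxy.example.com and the forward endpoint are placeholders for whatever your own server exposes:

```python
# Sketch of the proxy pattern's client side: rewrite a Twitter API URL
# so the request goes through your own host instead of directly to
# twitter.com. 'proxy.example.com' is a placeholder, not a real service.
try:
    from urllib.parse import quote  # Python 3
except ImportError:
    from urllib import quote        # Python 2, as App Engine ran at the time

PROXY_BASE = 'http://proxy.example.com/forward?url='

def via_proxy(twitter_url):
    # Percent-encode the whole target URL so it survives as a query value.
    return PROXY_BASE + quote(twitter_url, safe='')
```

The App Engine app would then urlfetch the rewritten URL, and the proxy host replays it against Twitter and relays the response.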
[google-appengine] Uploading an application
Hello, I uploaded my application an hour ago but don't know where to go from there. I tried to search for it on the web but couldn't find it. My application is called quranytopics. Any clues on what is going wrong, or do I have to wait for more time before trying to access it? Thank you very much, Nora
[google-appengine] Re: Twitter - rate limit exceeded
I could think of a rather simple solution for Twitter: just give everyone an API key that they use to authenticate, and make the limits based on the key instead of the IP address. No? On Mar 14, 12:57 am, Richard Bremner richyr...@gmail.com wrote: Hmmm, yes, this is a difficult one. Neither Twitter nor Google are being unreasonable, and each GAE developer is probably performing a sane number of Twitter API requests, but combined we are ruining it for everyone. ... If anyone thinks of anything, I'd love to know. Rich
[google-appengine] Re: Uploading an application
On Sat, Mar 14, 2009 at 7:32 PM, Nora noorhanab...@yahoo.co.uk wrote: Hello, I uploaded my application an hour ago but don't know where to go from there. I tried to search for it on the web but couldn't find it. My application is called quranytopics. Then your app is located at http://quranytopics.appspot.com. Any clues on what is going wrong, or do I have to wait for more time before trying to access it? It's available now; unfortunately, you'll need to debug it. :-) -- Faber Fedor Linux New Jersey http://linuxnj.com faberfedor.blogspot.com
[google-appengine] Re: Twitter - rate limit exceeded
That would work, and some Twitter API calls are authenticated already, but I guess it would be easy to bypass the limit by registering for multiple keys and having your app vary which key it uses. Maybe that doesn't matter in the big picture, though. rich 2009/3/15 MajorProgamming sefira...@gmail.com I could think of a rather simple solution for Twitter: just give everyone an API key that they use to authenticate, and make the limits based on the key instead of the IP address. No?
[google-appengine] Re: Twitter - rate limit exceeded
Hi Tim, I presume GAE requests can come from a large number of different IPs. My app had been working fine for months, but then I made a small change, did an upload, and it suddenly stopped working with this error, so I can only assume my app started coming from a different IP due to that upload, and the new IP had already made a lot of Twitter requests. Can the apps spontaneously change IP? I don't know enough about Google's infrastructure to diagnose it. It could be that your app will be blocked the next time you upload it; that's what happened to me. Good luck with your upload roulette :-) rich 2009/3/15 Tim Bull tim.b...@binaryplex.com Interesting, I have a Twitter app (http://twendly.appspot.com) but I don't seem to be having this issue at the moment. ...
[google-appengine] Re: SDK 1.1.9 breaks google-app-engine-django?
Where can I get r72? The latest is just r52 (http://code.google.com/p/google-app-engine-django/downloads/list). Thanks for your help. On Mar 10, 9:36 pm, arbi arbin...@gmail.com wrote: Hi, I had exactly the same problem of the importing-django error. The warning I got is: [WARNING:root:Blocking access to skipped file /Users/brouard/mysite/.google_appengine/lib/django/django/foo] Can Brett or Nuno help me? I have the 1.1.9 version of App Engine. The SDK folder (google_appengine) is already in /usr/local. But I don't understand what Brett said: The fix should be to not use the Django helper's little trick of keeping your SDK in .google_appengine but instead actually install it, or at least keep it outside of your app directory. That should prevent the skipped-file blocking from interfering with your imports. How can I fix this problem? (I am kind of a newb.) Thx, Arbi
[google-appengine] Problem when storing an integer into a model (GQL)
Here is the model I'm trying to update:

class ManifestVersion(db.Model):
    versionNumber = db.IntegerProperty()

Here is the method that should change the only instance of the previous model:

def incrementVersion():
    versionList = ManifestVersion.all()
    num = versionList[0].versionNumber
    num = num + 1
    versionList[0].versionNumber = num
    versionList[0].put()
    return versionList[0].versionNumber

But the versionNumber value is never updated and this function always returns 1. On the other hand, num is changed in this call. What am I doing wrong?
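One likely cause (hedged, since it depends on the old db API's behaviour): indexing a Query object like versionList[0] re-executes the query and returns a fresh entity object each time, so the entity that gets modified is never the one that gets put(). The plain-Python model below mimics that behaviour to show both the bug and the fix (fetch once into a variable); the Fake* classes are illustrative stand-ins, not App Engine APIs:

```python
# Plain-Python model of the suspected bug: each indexing of a query
# re-runs it and returns a FRESH copy, so the mutated copy is never
# the one that gets stored.

class FakeDatastore(object):
    def __init__(self):
        self.stored = {'versionNumber': 1}

class FakeQuery(object):
    """Mimics the suspected db.Query behaviour: __getitem__ re-executes
    the query and returns a fresh copy of the stored entity."""
    def __init__(self, store):
        self.store = store
    def __getitem__(self, i):
        return dict(self.store.stored)  # fresh copy on every access

def increment_broken(store):
    q = FakeQuery(store)
    q[0]['versionNumber'] = q[0]['versionNumber'] + 1  # mutates a throwaway copy
    # a put() here would write yet another unmodified copy
    return store.stored['versionNumber']

def increment_fixed(store):
    q = FakeQuery(store)
    version = q[0]                      # fetch ONCE into a variable
    version['versionNumber'] += 1
    store.stored = version              # put() the same object you modified
    return store.stored['versionNumber']
```

In the real code the fix is the same shape: version = ManifestVersion.all().get(), mutate it, then version.put(); wrapping the read-modify-write in a transaction would also guard against concurrent increments.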
[google-appengine] minimal http response headers
Does anyone know what is the minimum amount of data that an App Engine HTTP response can return? I need the ***absolute*** minimum, counting both headers and data. Actually, I don't even plan to return any data, just 1 bit of information through (custom) status response codes like 201 (for OK) or 501 (for ERR). (As to why, short answer: GPRS transfer charged per byte, lots of requests.) And since this is my custom app doing the requests, I will not even use a GET request, probably just HEAD, or even PUT (or anything that will help return the smallest amount of data). Also, any caching method (that could help) is possible for me (If-Modified-Since/ETag?). Doing a normal GET / HTTP/1.0 request on some App Engine app, I get these response headers:

HTTP/1.0 200 OK
Content-Type: text/html; charset=utf-8
Cache-Control: no-cache
Date: Sun, 15 Mar 2009 05:00:48 GMT
Server: Google Frontend
Content-Length: 2906
Connection: Close

I can see from the App Engine documentation that you can't modify the Content-Length, Date and Server fields, so I suppose they can't be disabled either. Is there a way to disable them, possibly using custom HTTP commands (HEAD/PUT?)? Or just a simple redirection, maybe? Are all those headers sent back when your script does a simple HTTP 302 redirect? Thanks in advance, Tomislav Jovanovic
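A status line with no body is about as small as HTTP gets, but since App Engine's mandatory headers (Content-Length, Date, Server) apparently can't be suppressed, the real floor is higher. This helper just counts the bytes of a candidate response so different status/header combinations can be compared on paper:

```python
# Back-of-envelope byte counter for a bodyless HTTP/1.0 response:
# status line, optional headers, and the blank line that ends the headers.

def response_size(status=204, reason='No Content', headers=()):
    lines = ['HTTP/1.0 %d %s' % (status, reason)]
    lines.extend('%s: %s' % (k, v) for k, v in headers)
    # Each line ends with CRLF; a final blank line terminates the headers.
    return len(('\r\n'.join(lines) + '\r\n\r\n').encode('ascii'))
```

This only counts the HTTP layer; TCP/IP framing adds its own per-packet overhead, which matters just as much on per-byte GPRS billing.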
[google-appengine] Re: Twitter - rate limit exceeded
Hi Tim, Just had a look at Twendly, looks good! I've just got a few quick questions, if you wouldn't mind... 1. By 'Google search API' you actually mean the 'Twitter search API', yeah? ;-) 2. How do you go about pulling data from Twitter every 5 minutes? Unless I'm missing something, there are no scheduled tasks in App Engine (yet). Using a cron job on another server to call a special URL, maybe? The API key sounds like the proper solution; it would be nice if there were a solution now, though. Just an idea that probably won't work for most cases: get the client (via JavaScript) to pull data from Twitter and send it on to App Engine for processing/storage. Not real pretty. Thanks, lock On Mar 15, 9:16 am, Tim Bull tim.b...@binaryplex.com wrote: Interesting, I have a Twitter app (http://twendly.appspot.com) but I don't seem to be having this issue at the moment. ...