[google-appengine] Re: Alternate Upload Tools...
There is probably no alternative method; work on fixing the appcfg.py method.

On Jan 10, 8:20 am, ramu wrote:
> First of all, thanks for this great product. Now I realize *how much* time
> I wasted looking for a free, reliable PHP host over the last two months.
> Two days ago I entered "data api" in google.co.in and came across this
> solution from Google. I have been reading and reading ever since, on
> different help forums and discussion groups (I even found gminifb.py, the
> Facebook API handler created for App Engine Python). I ran the SDK and
> followed each page of Getting Started. Oh yes, and first the reading about
> Python on the wiki. Finally the moment came to upload the test app from the
> Getting Started section. Whoosh, IT FAILED. After some more googling I
> realised I need some Python configuration to upload through the
> AUTHENTICATED proxy I use for my internet connection. I tried to the limit
> with no success.
>
> Now all I want is ANY alternate tool to upload my files to Google App
> Engine. Their count is well below 100, so uploading them individually is no
> problem.
>
> IS THERE any alternate method (which I couldn't google) for doing so?
>
> Ram Shanker

--
You received this message because you are subscribed to the Google Groups "Google App Engine" group.
To post to this group, send email to google-appengine@googlegroups.com
To unsubscribe from this group, send email to google-appengine+unsubscr...@googlegroups.com
For more options, visit this group at http://groups.google.com/group/google-appengine?hl=en
[google-appengine] Re: Restrict entire app access by IP? (for stage server)
On Fri, Jan 9, 2009, Andrew Yates wrote:
> Hi, is there a way to deny/allow IPs for in-house, private staging servers
> on App Engine, like one could create in Apache in the config or .htaccess
> settings?

I think there isn't. See also issue 644:
http://code.google.com/p/googleappengine/issues/detail?id=644

> If not, what is the best way to create a private, company staging server
> for testing?

You can check the client's IP address in your scripts and always return a
403 Forbidden response unless the IP matches a predefined whitelist.

--
Alexander
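Alexander's suggestion can be sketched as a small helper plus a status check. This is a minimal plain-Python illustration, not App Engine API code; the addresses and function names are made up for the example (on App Engine the client address is available to a handler as the REMOTE_ADDR CGI variable or `self.request.remote_addr`):

```python
# Minimal IP-whitelist sketch: return 403 Forbidden unless the client's
# address is on a predefined whitelist.

ALLOWED_IPS = frozenset([
    '203.0.113.10',   # example office gateway address
    '203.0.113.11',
])

def is_allowed(remote_addr, whitelist=ALLOWED_IPS):
    """Return True if the client address is on the whitelist."""
    return remote_addr in whitelist

def check_access(remote_addr):
    """Return the HTTP status to send: 200 for whitelisted clients, else 403."""
    if is_allowed(remote_addr):
        return 200
    return 403
```

A real handler would call `check_access` at the top of every request (or in shared base-handler code) and short-circuit with a 403 response before doing any work.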
[google-appengine] External database dump
I would like to use GAE with data created and maintained by a different
application on a different server. My solution was to generate an XML file
with all the data and parse it to create/update the related GAE entities.

Clearly this is not a CPU-friendly solution (especially considering that
fetch operations count as CPU operations!), and I get a nice "Dude, this is
whack!" message with errors and CPU quota warnings in the logs. I will try
to copy the XML MANUALLY into an input field to get rid of the errors (this
procedure looks whack to me!), but since there are thousands of entities to
update, I have the feeling that won't be enough.

Is there any other recommended way to work with large data sets without
creating CPU issues? I don't know, maybe using time.sleep() to give the CPU
a break? It seems the CPU errors are pretty common; Google should probably
give more info about how to avoid them.

Thanks, chr
[google-appengine] Re: External database dump
I guess you have to split your XML file and work through it in steps. Did
you see how the bulkloader works?

On Jan 11, 2:04 pm, gabon wrote:
> Is there any other recommended way to work with large data without
> creating CPU issues?
[google-appengine] Re: New Project : PHPGeoCache For Local Storage and Google App Engine
Hi,

The project has been renamed from PHPGeoCache to PHPGeoTiles. The new
address is http://www.geowebdeveloper.com/phpgeotiles/

By the way, the version is also updated to 0.2 with a new feature: ArcGIS
Server REST API to Google App Engine :) Sorry for the naming conflict.

Thanks,
Alper Dincer
http://www.geowebdeveloper.com

On Jan 11, 1:41 am, Alper wrote:
> Hi,
>
> I have just released a new project named "PHPGeoCache", a tile proxy for
> WMS, like TileCache or GeoWebCache, written in PHP. The relation of the
> project to this group is that Google App Engine can be used as tile
> storage.
>
> You can easily use it with the Google Maps API, MS Virtual Earth or
> OpenLayers projects.
>
> I'm very glad to hear your comments, questions or bug reports about the
> project.
>
> Thanks.
> Alper Dincer. http://www.geowebdeveloper.com
>
> Blog entry about the project:
> http://www.geowebdeveloper.com/2009/01/11/phpgeocache-for-google-app-...
> Project page: http://www.geowebdeveloper.com/phpgeocache/
[google-appengine] Re: External database dump
I started optimizing everywhere. Now I have a textarea input where I paste
a JSON string (I presume it is faster to parse than XML). I still have
problems.

Is splitting the operation into more manual steps really the solution? Is
it not possible to have an automated, longer process, for instance giving
the CPU breaks with time.sleep()?

How could I split the update of thousands of entities automatically? I am
thinking of redirecting the page to a new URL, passing along the data still
to update, and processing some of it each time. It sounds pretty crazy, but
if the limit is the time to generate a page, I don't see other solutions.

With bulkloader, do you mean the ActionScript 3 library?

Thanks, chr

On Jan 11, 12:18 pm, Greg Temchenko wrote:
> I guess you have to split your XML file and work with it by steps.
> Did you see how bulkloader works?
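The redirect idea gabon describes — process one slice per request, then hand the remaining offset to a follow-up request — can be sketched in plain Python. The function names and batch size here are illustrative, not App Engine API:

```python
# Sketch of batched processing with a continuation offset. process_batch
# would be called once per request; the returned next_offset would go into
# the redirect URL (e.g. /import?offset=20) until it is None.

BATCH_SIZE = 10  # illustrative; tune so one batch fits the request deadline

def process_batch(items, offset, apply_fn, batch_size=BATCH_SIZE):
    """Apply apply_fn to one batch of items starting at offset.

    Returns the offset for the next request, or None when everything has
    been processed.
    """
    batch = items[offset:offset + batch_size]
    for item in batch:
        apply_fn(item)            # e.g. create/update one entity
    next_offset = offset + len(batch)
    return next_offset if next_offset < len(items) else None
```

Each request stays short, so no single page generation hits the CPU or time limit; the client (or a meta-refresh/redirect) drives the loop.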
[google-appengine] How to upload APP through an *proxy server with basic authentication* .
Dear Google App Engine developers,

I am halfway through my real app but can't find a way to upload even this
basic "hello world" learning app. Please help me. I am attaching the error
log as a screenshot. Thanks in advance.

Error: http://picasaweb.google.com/lh/photo/7mWe7DzlYAH88v1op55Z6Q?feat=directlink
[google-appengine] Re: External database dump
I think one other weird way of achieving this would be a JavaScript/Ajax
solution. Let the JavaScript in the browser do the XML parsing and hit App
Engine with an Ajax request for each entity. In that case you have control
over including a wait period between your calls.

cheers,
gipsy

On Sun, Jan 11, 2009 at 6:45 AM, gabon wrote:
> How could I split in an automatic way the update of thousands of
> entities?
[google-appengine] Re: App Engine Deep Zoom of 100 Megapixel image
Fun! Good work.

On Jan 10, 2:19 am, antsyawn wrote:
> I thought I'd post a mashup I put together using App Engine. It uses
> Microsoft Seadragon and Lovepixel's 1x1 cityscape.
>
> The Seadragon data weighs in at 93 MB and is composed of over 2000 files.
>
> I recommend viewing with Chrome, since it's by far the fastest for this
> site.
>
> http://lovepixelzoom.appspot.com
[google-appengine] Re: External database dump
Bulkloader is a GAE utility:
http://code.google.com/appengine/articles/bulkload.html

It reads CSV files, but maybe you can rewrite it to read XML files.

2009/1/11 gabon:
> With bulkloader, you mean the actionscript 3 library?
[google-appengine] Re: External database dump
I mean the GAE data uploader. You can read about it here:
http://code.google.com/appengine/articles/bulkload.html

It splits a CSV file into requests of 10 lines each and inserts them step
by step.

On Jan 11, 3:45 pm, gabon wrote:
> With bulkloader, you mean the actionscript 3 library?
[google-appengine] Google Frontend server and multimedia content for iPhone
I'm trying to stream a small .mp4 movie from my app to an iPhone, and can't
avoid getting a "This Movie Could Not Be Played" error in the iPhone movie
player. This is by simply providing an HTML link to the content (I've also
tried embedding using "" tags with no more success). I've narrowed it down
to something (not sure what) about what's being served by the Google
Frontend web server. My findings:

- There is no question about the movie being iPhone-compatible. Not only
  does it play in the iPhone's iPod app after syncing with iTunes, but it
  plays fine from iPhone Safari if I host the exact same file on an IIS web
  server (more on that below).
- Behavior is the same whether I use a static .mp4 file (size is less than
  1 MB) or serve raw data from a blob object in a request handler in my
  Python code. The static .mp4 is just for testing purposes; in production
  I'll be using the latter method, which also seems to give me more
  flexibility in setting response headers anyway (once again, more below).
- The media content, whether static or served from the blob field, serves
  fine to Safari on a good old Mac desktop (or any desktop browser I've
  tried).
- I've cleared my cache in iPhone Safari as one last sanity check.

Now, as for web servers and response headers. As stated above, everything
works if I host the static file on an IIS server. According to
http://www.askapache.com/online-tools/http-headers-tool/, the response
headers from IIS are as follows:

    HTTP/1.1 200 OK
    Date: Sun, 11 Jan 2009 14:17:24 GMT
    Server: Microsoft-IIS/5.0
    X-Powered-By: ASP.NET
    Content-Type: video/mp4
    Accept-Ranges: bytes
    Last-Modified: Sun, 11 Jan 2009 02:22:18 GMT
    ETag: "029d46c9373c91:1a9f"
    Content-Length: 749736
    Via: 1.1
    Connection: close

The response headers for the static file from Google Frontend are:

    HTTP/1.1 200 OK
    Date: Sun, 11 Jan 2009 15:25:50 GMT
    Expires: Tue, 10 Feb 2009 15:25:50 GMT
    Content-Type: video/mp4
    Server: Google Frontend
    Content-Length: 749736
    Connection: Close

Serving the data directly from a request handler, I can modify the response
headers a little better, but I still get the same behavior on the iPhone:

    HTTP/1.1 200 OK
    Content-Type: video/mp4
    Accept-Ranges: bytes
    Cache-Control: none (also tried "public", "private", and a high max-age)
    Date: Sun, 11 Jan 2009 15:01:03 GMT
    Server: Google Frontend
    Content-Length: 749736
    Connection: Close

This is as close as I can get the response headers to what I get from the
IIS server, and whatever differences remain don't seem that relevant as far
as I can tell. I could just as easily post this on an Apple forum to find
out what makes the iPhone so fussy, but I'm wondering if anyone intimately
familiar with Google Frontend and serving media files might know what's
going on behind the scenes that could cause this.
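One visible difference in the header dumps above is that the working IIS response advertises `Accept-Ranges: bytes`, and mobile media players commonly fetch video with byte-range requests. A handler serving the blob itself would then need to honor the incoming `Range` header; here is a minimal, hedged sketch of parsing a simple single-range header (plain Python, not App Engine API — a real handler would answer 206 Partial Content with a matching `Content-Range`):

```python
def parse_range(range_header, content_length):
    """Parse a simple 'bytes=start-end' Range header.

    Returns an inclusive (start, end) pair, or None if the header is absent
    or is not a single byte range (multi-range requests are not handled in
    this sketch).
    """
    if not range_header or not range_header.startswith('bytes='):
        return None
    spec = range_header[len('bytes='):]
    if ',' in spec:                    # multiple ranges: punt
        return None
    start_s, _, end_s = spec.partition('-')
    if start_s == '':                  # suffix range: the last N bytes
        length = int(end_s)
        return (max(content_length - length, 0), content_length - 1)
    start = int(start_s)
    end = int(end_s) if end_s else content_length - 1
    return (start, min(end, content_length - 1))
```

Whether App Engine's frontend of that era passed the `Range` header through (or honored it for static files) is exactly the open question of this thread; the sketch only shows what honoring it would involve.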
[google-appengine] BadRequestError: offset may not be above 4000
I wrote a cron job that goes through and recalculates my members' ranks 10
at a time using the following query:

    profiles = Profile.all().order('-score').fetch(10, offset)
    for profile in profiles:
        # calculate rank

but once I get to an offset of more than 4000, I get this:

    BadRequestError: offset may not be above 4000

Google team: why is this limit applied? Otherwise I have to construct a
seemingly less efficient query like this:

    mod_rank = int(offset % interval)
    profiles = Profile.all().filter('score <', rank_score).order('-score').fetch(10, mod_rank)
    if offset % interval == 0:
        memcache.set('rank_score', profile.score)
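The workaround the poster sketches is essentially keyset (cursor-style) pagination: instead of an absolute offset, remember the last score seen and filter on `score <` that value, so every page starts from an indexed position rather than skipping rows. A plain-Python illustration of the idea (the datastore query is simulated with a sorted list; names are illustrative, and it assumes scores are unique — with ties, a real implementation would page on (score, key) pairs):

```python
def fetch_page(scores, last_score=None, page_size=10):
    """Simulate Profile.all().filter('score <', last_score)
    .order('-score').fetch(page_size) over a plain list of scores."""
    ordered = sorted(scores, reverse=True)
    if last_score is not None:
        ordered = [s for s in ordered if s < last_score]
    return ordered[:page_size]

def iterate_all(scores, page_size=10):
    """Walk the whole data set page by page without any offset."""
    last = None
    while True:
        page = fetch_page(scores, last, page_size)
        if not page:
            break
        for s in page:
            yield s
        last = page[-1]      # the cursor: the smallest score seen so far
```

This is also why offset is capped: the datastore must read and discard all skipped entities, so `offset=4000` costs as much as fetching 4000 rows, while the filter-on-last-score form jumps straight to the right index position.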
[google-appengine] Re: External database dump
Well, the Bulk Data Uploader definitely seems very useful. What if I want
to update the records every now and then?

Thanks, chr

On Jan 11, 3:31 pm, Greg Temchenko wrote:
> I mean GAE data uploader. You can read here:
> http://code.google.com/appengine/articles/bulkload.html
>
> It splits a csv file into 10 lines requests and inserts it step by step.
[google-appengine] Re: polish locale in app engine
On Sat, Jan 10, 2009 at 10:29 AM, Greg Temchenko wrote:
> I have the same trouble with the Russian locale. I tried 'ru', 'ru_RU',
> and 'ru_RU.UTF-8', and every time I get the message "locale not
> supported". I have no idea. It seems GAE does not support locales well...
>
> On Dec 31 2008, 4:32 am, konryd wrote:
>> Hello,
>>
>> I'm having huge trouble trying to sort a list of strings with respect
>> to the Polish locale. It seems there is no such locale installed on the
>> GAE servers (I'm getting the error "locale not supported"). Do you have
>> a list of all locales available in GAE (I need 'pl_PL.UTF-8')?

Sorry for the trouble. App Engine does not support any locale besides the
default POSIX "C" locale. The reason is that the locale modules used by
Python are operating-system dependent and vary greatly from computer to
computer. It would be difficult to provide a consistent view of all
available locales across developer machines and our Google servers.

Please file an issue on the issue tracker if you would like to see this
addressed (though I'm not sure what the priority would be):
http://code.google.com/p/googleappengine/issues/list

However, keep in mind that it should not be too difficult to write your own
simple functions to handle your locale properly. I realize this isn't
ideal, but it should help you move forward!

-Brett
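Brett's "write your own simple functions" suggestion could look like the following for Polish: a sort key that maps each letter to its position in the Polish alphabet instead of its Unicode code point. This is only a sketch under stated assumptions — it lowercases, ignores digraph rules, and pushes characters outside the alphabet to the end:

```python
# Minimal Polish collation sketch: sort by alphabet position rather than
# code point, so that e.g. 'ł' sorts right after 'l' instead of after 'z'.
POLISH_ALPHABET = u'a\u0105bc\u0107de\u0119fghijkl\u0142mn\u0144o\u00f3pqrs\u015btuvwxyz\u017a\u017c'
# i.e. 'aąbcćdeęfghijklłmnńoópqrsśtuvwxyzźż'

def polish_key(word):
    """Sort key: tuple of alphabet positions; unknown characters sort last."""
    return tuple(
        POLISH_ALPHABET.index(ch) if ch in POLISH_ALPHABET
        else len(POLISH_ALPHABET)
        for ch in word.lower()
    )

words = [u'\u017curaw', u'zebra', u'\u0142o\u015b', u'lato', u'\u0107ma', u'cud']
words.sort(key=polish_key)
# With plain code-point sorting, the 'ć', 'ł' and 'ż' words would all land
# after every plain-ASCII word; with polish_key they interleave correctly.
```

The same pattern works for Russian or any other alphabet with a fixed letter order, which is exactly the workaround when `locale.strcoll` is unavailable.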
[google-appengine] Weird problem with Reportlab + ImageReader + urlfetch
Hi, I spent a lot of time trying to figure out ways to embed images in a
PDF. I tried all sorts of stuff and hacks but am stuck on this problem.

1. I want to embed a Google chart:
   http://chart.apis.google.com/chart?cht=p3&chd=t:60,40&chs=250x100&chl=Hello|World

2. Hence I do:
   graphtoembed = urlfetch.fetch("http://chart.apis.google.com/blahblah")

3. The problem is, ReportLab's drawImage requires a file name to embed the
   image, or can take an ImageReader object (which in turn asks for a local
   file name in its constructor).

4. How should I pass image content read from urlfetch to ReportLab, given
   that I am not allowed to dynamically create/store files on App Engine?

Even just as a test, if I do this for static images:

    path = os.path.join(os.path.dirname(__file__), 'static/images/test.gif')
    img_obj = ImageReader(path)
    canvas.drawImage(img_obj, 200, 800)

I get this:

    Cannot open resource "/base/data/home/apps/onepageresume [blahblah]"

Please show me a direction. The goal is simple: just retrieve images from
Google Charts and embed them in a PDF.
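A common way around the no-filesystem restriction is to wrap the fetched bytes in an in-memory file object. ReportLab's ImageReader has historically accepted file-like objects as well as file names, though that is an assumption worth checking against your ReportLab version; the sketch below keeps the ReportLab and urlfetch calls in comments (so it runs without either SDK) and only demonstrates the wrapping itself:

```python
from io import BytesIO  # on App Engine's Python 2.5, cStringIO.StringIO plays this role

# On App Engine the chart bytes would come from urlfetch (shown as a
# comment, since this sketch runs without the App Engine SDK):
#   result = urlfetch.fetch(chart_url)
#   image_bytes = result.content
image_bytes = b'GIF89a...'  # placeholder for the fetched chart data

# A file-like object backed by memory: nothing is written to disk.
image_file = BytesIO(image_bytes)

# Assumption: ImageReader accepts a file-like object, not only a path.
#   img = ImageReader(image_file)
#   canvas.drawImage(img, 200, 800)
```

If ImageReader in your version really does insist on a path, the fallback is usually a drawImage variant or PIL Image constructed from the same in-memory buffer.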
[google-appengine] Re: Googe App Engine and Friend Connect...
Hi benzrad,

Go ahead and add that code to your app.yaml; it works for my site on GAE.
A few more notes:

* I put canvas.html and rpc_relay.html in the %AppHome%/media/ folder.
* Add the following at the beginning of app.yaml. Make sure these handlers
  are registered **before** other Python/Django handlers:

    - url: /canvas.html
      static_files: media/canvas.html
      upload: media/canvas.html

    - url: /rpc_relay.html
      static_files: media/rpc_relay.html
      upload: media/rpc_relay.html

* Test on your local server; it should work. Then upload to GAE to check
  the live gadget.

On Jan 4, 10:12 am, benzrad wrote:
> i tried the code, but it doesn't work on my app at
> http://app21zh.appspot.com. In Friend Connect, during the process to set
> up the site, it still reports that it can't find the 2 files, which I had
> placed everywhere, including the static folder. I need more
> instructions. TIA.
>
> On Jan 1, 3:56 pm, "Shalin Shekhar Mangar" wrote:
>> You can add them inside the "static" directory. You must also map them
>> to '/' (root) by adding the following in your app.yaml file:
>>
>>     - url: /canvas.html
>>       static_files: static/canvas.html
>>       upload: static/canvas.html
>>
>>     - url: /rpc_relay.html
>>       static_files: static/rpc_relay.html
>>       upload: static/rpc_relay.html
>>
>> Hope that helps.
>>
>> On Mon, Dec 29, 2008 at 7:21 AM, benzrad wrote:
>>> dear sir,
>>> i also want to add Google Friend Connect to my GAE app, but in which
>>> directory should i place the 2 files google asks to upload, i.e.
>>> canvas.html & rpc_relay.html? i placed them in the root folder of my
>>> app, and also in the static folder with the other html files, but
>>> neither works: the setup process of google friend connect can't find
>>> them. where and how do i set things up in the py file so the 2 html
>>> files can be found in the root folder? i am absolutely blank on python
>>> code, but i will try to learn. please help me out.
>>
>> --
>> Regards,
>> Shalin Shekhar Mangar.
[google-appengine] Re: Support for Decimal in Datastore?
If you're adding a bunch of these up, it would be unneeded overhead. Those
building apps that handle financial data, or other numerics that require
precision, need a decimal data type.

On Dec 5 2008, 7:25 am, Justin wrote:
> Why not store the value as a string in the datastore and convert it to a
> decimal in Python on retrieval? You could use a property on your model to
> do it for you.
>
>     import decimal
>
>     class MyModel(db.Model):
>         string_amount = db.StringProperty(multiline=False)
>
>         def get_amount(self):
>             return decimal.Decimal(self.string_amount)
>
>         def set_amount(self, value):
>             self.string_amount = str(value)
>
>         amount = property(get_amount, set_amount)
>
> - Justin
>
> On Dec 5, 3:11 am, "Fred Janon" wrote:
>> Float and Decimal are two different types. Float is used in scientific
>> calculations, Decimal for accounting/financial applications. Decimal
>> exists in Python but doesn't seem to be implemented in the Datastore.
>>
>> Fred
>>
>> On Fri, Dec 5, 2008 at 14:58, lock wrote:
>>> There is the FloatProperty; pretty sure that's what you're after.
>>> http://code.google.com/appengine/docs/datastore/typesandpropertyclass...
[google-appengine] How can I edit Entity by query use key.id?
Hi,

In PHP, I get the id from the URL, like http://...?id=2, and then:

    mysql_query("update Note set coulm='value' where id='$_GET[id]'");

In Google App Engine:

Step 1 (get the ID):

    id = int(self.request.get('id'))

Step 2 (does not work!) -- get the entity by the ID from the GET request:

    query = db.GqlQuery("SELECT * FROM Note WHERE key = :1", id)

Step 3:

    for c in query:
        c.colum = self.request.get('value')
    db.put(query)

Can anyone help me? Thanks a lot!
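For what it's worth, a datastore key is not the same thing as the numeric id, so comparing `key = :1` against a bare integer in GQL will not match; the usual pattern is to fetch the entity with `Model.get_by_id()`, mutate it, and `put()` it back. Since this sketch runs without the App Engine SDK, a dict stands in for the datastore, but the shape of the fix is the same (the `colum` field name is kept from the post):

```python
# Sketch of the "fetch by numeric id, mutate, save" pattern. On App Engine
# this would be roughly:
#   note = Note.get_by_id(int(self.request.get('id')))
#   note.colum = self.request.get('value')
#   note.put()
# Here a dict simulates the Note kind so the logic is runnable.

datastore = {}  # id -> entity dict, standing in for the datastore

def get_by_id(note_id):
    """Stand-in for Note.get_by_id: returns the entity or None."""
    return datastore.get(int(note_id))

def update_note(note_id, value):
    """Fetch one entity by id, update a field, and 'save' it back."""
    note = get_by_id(note_id)
    if note is None:
        return False
    note['colum'] = value
    datastore[int(note_id)] = note  # stand-in for note.put()
    return True
```

Fetching a single known entity this way is also cheaper than running a query: it is a direct key lookup rather than an index scan.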
[google-appengine] PostalAddressProperty
Not sure if I am understanding the datastore model correctly. I am looking to store address, city, zip, and county in a model and then be able to query by, say, zip or city or county. Can just one field with PostalAddressProperty be used, or do I need to duplicate data? I am not sure whether PostalAddressProperty allows for county, and if I have to filter, how do I filter on a single field that contains everything? address = db.PostalAddressProperty() city = db.StringProperty(multiline=False) state = db.StringProperty(multiline=False) zipCode = db.IntegerProperty() county = db.StringProperty(multiline=False) Any suggestions on approach?
[google-appengine] The Chinese SDK version is 1.1.0 ,WHY?
I downloaded the Google App Engine SDK from Google; the Chinese page offers version 1.1.0 but the English page offers 1.1.7. Why?

http://code.google.com/intl/zh-CN/appengine/downloads.html Windows 1.1.0 - 5/28/08 GoogleAppEngine_1.1.0.msi 2.5 MB e0c0bc69e8005fbf338ef40ea569f890b25ea011

But http://code.google.com/intl/en/appengine/downloads.html (Platform / Version / Package / Size / SHA1 Checksum): Windows 1.1.7 - 11/21/08 GoogleAppEngine_1.1.7.msi 2.6 MB 26049db14b41e87b7b80a769cd479ad1e06824b1
[google-appengine] Can not delete entity using Data Viewer and a GQL query.
There was a bug in my app and some bad data got into the Datastore, so I want to delete some data. The models have a number of items, so rather than paging through them all 20 at a time to find it, I ran the GQL query: SELECT * FROM UserUpdates WHERE windAngle=328 This returns one row as expected. I then click the tickbox beside it and click Delete. I get a dialog box asking if I am sure, to which I say yes. I then get an error displayed in a red box at the top saying "The URL to forward to once the request is fulfilled" - yes, the error is a partial sentence and makes no sense. The item is NOT deleted. This is happening with all 4 of my models. However, 2 of them only have a few hundred entries, so I was able to page through, find the item, and delete it; when I paged through, the delete succeeded. I think there must be a bug in the Data Viewer when removing data found via a GQL query. Thanks
[google-appengine] Re: BadRequestError: offset may not be above 4000
The offset is applied _after_ the query is executed and the entities are fetched. Re-read the "Executing the Query and Accessing Results" section from http://code.google.com/appengine/docs/datastore/creatinggettinganddeletingdata.html

So, using offset to go through all your entities is not a good idea. You should use a __key__ query instead. Read the "Queries on Keys" section from http://code.google.com/appengine/docs/datastore/queriesandindexes.html for details; it even has a complete code sample. Hope this helps.

Cheers, Alex -- www.muspy.com

On Jan 12, 4:12 am, wsstefan wrote: > I wrote a cronjob that goes through and recalculates my members' ranks > 10 at a time using the following request: > > profiles = Profile.all().order('-score').fetch(10, offset) > > for profile in profiles: > # calculate rank > > but once I get to an offset of more than 4000 I get this: > > BadRequestError: offset may not be above 4000 > > Google Team: Why is this limit applied? Otherwise I have to > construct a seemingly less efficient query like this: > > mod_rank = int(offset % interval) > profiles = Profile.all().filter('score <', rank_score).order('- > score').fetch(10, mod_rank) > if offset % interval == 0: > memcache.set('rank_score', profile.score)
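The key-based paging Alex recommends can be sketched as follows. Since the datastore is not available here, this illustration pages over a plain sorted list standing in for entity keys; in a real handler each batch would come from a query along the lines of `Profile.all().filter('__key__ >', last_key).order('__key__').fetch(batch_size)` (model name hypothetical).

```python
def paged(keys, batch_size):
    """Yield batches using key-based paging: instead of an ever-growing
    offset, remember the last key seen and ask for items strictly
    greater than it. `keys` is a sorted in-memory list standing in for
    datastore keys."""
    last_key = None
    while True:
        if last_key is None:
            batch = keys[:batch_size]
        else:
            # In App Engine this comparison-and-fetch would be the
            # __key__ > last_key query; here it is done in memory.
            batch = [k for k in keys if k > last_key][:batch_size]
        if not batch:
            break
        yield batch
        last_key = batch[-1]
```

Each iteration does the same amount of work regardless of how deep into the data it is, which is exactly what an offset-based loop cannot promise.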
[google-appengine] Re: How can I edit Entity by query use key.id?
Step 2: entity = Note.get_by_id(id) which is actually a shortcut for: entity = Note.get(db.Key.from_path('Note', id)) or, if you insist on using GQL: entity = Note.gql('WHERE __key__ = :1', db.Key.from_path('Note', id)).get() ...but that's less efficient.

Step 3: entity.column = my_value entity.put()

You don't have to iterate over query results if you expect exactly one entity. Hope this helps.

Cheers, Alex -- www.muspy.com

On Jan 11, 5:16 pm, BigMouth Li wrote: > Hi: > > In PHP: > get the id like http://...?id=2, and: > mysql_query(update Note set column='value' where id='$_GET[id]') > > In Google App Engine: > step 1: > id=int(self.request.get('id')) #get ID > > step 2: (does not work!) > query = db.GqlQuery("SELECT * FROM Note WHERE key = :1", id) #get > Entity by 'GET method ID'? > > step 3: > for c in query: > c.column = self.request.get('value') > db.put(query) > > Can anyone help me? Thanks a lot!
[google-appengine] Re: user login URLs in html/javascript
Gipsy,

Yes, I finally got the signout link, also. It was tricky because it never appeared in the address bar, so I had to read it from the status bar and transcribe it manually into code. For reference's sake, this is the link I got. I think anyone could use it with their own appspot.com link substituted.

http://carpoolfinder.appspot.com/_ah/logout?continue=https://www.google.com/accounts/Logout%3Fcontinue%3Dhttp://carpoolfinder.appspot.com/%26service%3Dah

Thank you again for your help.

On Jan 10, 11:05 pm, "Gipsy Gopinathan" wrote: > go to http://dpaste.com/107358/ > > You should be able to create the logout link by just changing it to > > users.create_logout_url("/") instead of users.create_login_url("/") > > On Sat, Jan 10, 2009 at 9:43 PM, thebrianschott wrote: > Brian in Atlanta
[google-appengine] Re: Creating a short unique URL for an entity
Just create a random n-character string for each document. Do that in put() before the entity is saved for the first time. Choose n based on what risk of collision (two documents having the same randomly generated slug) is practical. I'd say n=6 is pretty good.

On Jan 10, 10:24 am, Jesse Grosjean wrote: > I'm relatively new to web programming, so don't feel bad about > telling me the obvious :) > > I have a simple model consisting of Account entities that can own many > Document entities. I'd like to create an as-short-as-possible, globally > unique URL scheme that I can use to access any document. For testing > right now I'm just using the document's .key() in the URL. So for > example my URLs look something like this: > > /documents/ > ag13cml0ZXJvb20tY29tchwLEgdBY2NvdW50GH4MCxIIRG9jdW1lbnQYiQEM > > But that's really long. I'm now trying to figure out the easiest way > to make it shorter. One option that I'm considering is to just put the > account id and document id in the URL separated by a dash like this: > > /documents/126-137 > > And then I can reconstruct the Key like this: > > db.Key.from_path('Account', account_key, 'Document', document_key) > > I haven't actually tried this yet... but I think it should work? > > My goal is a unique, short, permanent URL for every document in the > system. If I'm going about this the wrong way or doing something dumb > please let me know. > > Thanks, > Jesse
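The random-slug suggestion is easy to sketch in plain Python; `make_slug` and `ALPHABET` are illustrative names, not part of any App Engine API.

```python
import random
import string

# 26 lowercase letters + 10 digits = 36 possible characters per position.
ALPHABET = string.ascii_lowercase + string.digits

def make_slug(n=6):
    """Generate a random n-character slug. With n=6 there are 36**6
    (about 2.2 billion) possible slugs, so collisions are rare for small
    document counts, but a real put() should still check for a clash
    and retry with a fresh slug if one occurs."""
    return ''.join(random.choice(ALPHABET) for _ in range(n))
```

The birthday-paradox caveat is worth keeping in mind: the collision risk grows with the square of the number of documents, so n should be raised well before the document count approaches the square root of 36**n.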
[google-appengine] Re: Announcing: System Status Dashboard, Quota Details Page, and a Preview of Billing
What about memcache? I could not find prices (or even quotas) for the memcache API.

On Jan 8, 2:00 pm, Marzia Niccolai wrote: > Hi, > > First, regarding Google Checkout - this will be the only method of payment > available when billing launches; however, we are still committed to > supporting our developers who live in the few countries where Google > Checkout is not available. For developers located in those countries, quotas > for particular applications may be raised. Decisions will be made strictly > on a case-by-case basis and may take up to two weeks. > > The pricing upon launch of billing can be found here: > http://googleappengine.blogspot.com/2008/05/announcing-open-signups-e... > > -Marzia > > On Thu, Jan 8, 2009 at 9:44 AM, mclovin wrote: > > > Am I missing something somewhere? I can't seem to find any pricing for > > commercial applications even though Google advertises App Engine as a > > solution for commercial deployment.
[google-appengine] Re: Creating a short unique URL for an entity
Why worry about the risk of collision when you can generate relatively short URLs using the Kind string and the id of the entity? Here's one of the first recipes in the cookbook: http://appengine-cookbook.appspot.com/recipe/mapping-keys-to-urls

On Jan 11, 5:21 pm, Mahmoud wrote: > Just create a random n-character string for each document. Do that in > put() before the entity is saved for the first time. Choose n based on > what risk of collision (two documents having the same randomly > generated slug) is practical. I'd say n=6 is pretty good. > > On Jan 10, 10:24 am, Jesse Grosjean wrote: > > > I'm relatively new to web programming, so don't feel bad about > > telling me the obvious :) > > > I have a simple model consisting of Account entities that can own many > > Document entities. I'd like to create an as-short-as-possible, globally > > unique URL scheme that I can use to access any document. For testing > > right now I'm just using the document's .key() in the URL. So for > > example my URLs look something like this: > > > /documents/ > > ag13cml0ZXJvb20tY29tchwLEgdBY2NvdW50GH4MCxIIRG9jdW1lbnQYiQEM > > > But that's really long. I'm now trying to figure out the easiest way > > to make it shorter. One option that I'm considering is to just put the > > account id and document id in the URL separated by a dash like this: > > > /documents/126-137 > > > And then I can reconstruct the Key like this: > > > db.Key.from_path('Account', account_key, 'Document', document_key) > > > I haven't actually tried this yet... but I think it should work? > > > My goal is a unique, short, permanent URL for every document in the > > system. If I'm going about this the wrong way or doing something dumb > > please let me know. > > > Thanks, > > Jesse
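The kind-plus-id approach can be sketched without the SDK. The function names here are illustrative (I have not reproduced the cookbook recipe itself); in a real handler the parsed pieces would feed `db.Key.from_path(kind, entity_id)` to load the entity.

```python
def entity_url(kind, entity_id):
    """Build a short, human-readable URL from a kind name and the
    entity's auto-assigned numeric id, e.g. ('Document', 137) ->
    '/document/137'. No collision risk: the datastore guarantees the
    id is unique within the kind."""
    return '/%s/%d' % (kind.lower(), entity_id)

def parse_entity_url(path):
    """Recover (kind, id) from a URL built by entity_url. A real App
    Engine handler would then call db.Key.from_path(kind, entity_id)
    to fetch the entity directly by key."""
    _, kind, raw_id = path.split('/')
    return kind.capitalize(), int(raw_id)
```

One caveat the cookbook approach shares with Jesse's dash scheme: it only works for entities with auto-assigned numeric ids (no key names), and for entities with ancestors the full path must be encoded, not just the final id.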
[google-appengine] Re: user login URLs in html/javascript
I am glad that you have resolved it, but I'm still wondering why you have to hard-code these links.

On Sun, Jan 11, 2009 at 4:48 PM, thebrianschott wrote: > > Gipsy, > > Yes, I finally got the signout link, also. It was tricky because it > never appeared in the address bar, so I had to read it from the status > bar and transcribe it manually into code. For reference's sake, this is > the link I got. I think anyone could use it with their own > appspot.com link substituted. > > > http://carpoolfinder.appspot.com/_ah/logout?continue=https://www.google.com/accounts/Logout%3Fcontinue%3Dhttp://carpoolfinder.appspot.com/%26service%3Dah > > Thank you again for your help. > > On Jan 10, 11:05 pm, "Gipsy Gopinathan" wrote: > > go to http://dpaste.com/107358/ > > > > You should be able to create the logout link by just changing it to > > > > users.create_logout_url("/") instead of users.create_login_url("/") > > > > On Sat, Jan 10, 2009 at 9:43 PM, thebrianschott wrote: > > > > > Brian in Atlanta > > > > -- cheers Gipsy
[google-appengine] versions share db?
It appears that different versions of an app share the db. Do they share other stuff? Is it possible to have a separate db for each version? A pointer to a precise definition of what constitutes a version of an app would be appreciated (http://code.google.com/appengine/docs/configuringanapp.html doesn't tell me much).
[google-appengine] Re: user login URLs in html/javascript
Gipsy,

It is a consequence of my chosen design: allowing both "clients" and "organizers" to enter the app through the same door. This way the organizers can identify themselves before they enter the app, so that they need not be asked later who they are. I was having difficulty coding something like a JavaScript confirm dialog in Python to ask that question, so I elected to do it this way. I didn't anticipate that this would be difficult, but it turned out to be pretty hard to do, too. My knowledge of OOP and client/server programming is very limited, as you might have deduced. The only way I was able to build the organizer's wrapper on this application was that I stumbled on a coding genius who took me under his wing at an all-day Saturday Google hackathon. All I had coded when I arrived at the hackathon was the client's system, and when I left I had the organizer's portion, but it did not quite work the way I wanted, so I have been tweaking it ever since. Is that more than you wanted to know? Does that answer your question?

On Jan 11, 9:59 pm, "Gipsy Gopinathan" wrote: > I am glad that you have resolved it, but I'm still wondering why you have > to hard-code these links

Brian in Atlanta
[google-appengine] Re: Limit on number of entities per transaction
If you mean in a single db.put() or db.delete() call, yes, the limit is 500. If you mean in an actual datastore transaction, with multiple put/delete calls, then no, there's no limit, just the overall HTTP request deadline.
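Staying under the 500-entity-per-call limit is a simple chunking exercise. This is a plain-Python sketch; `in_batches` is an illustrative helper name, and in a real app each yielded chunk would be passed to a separate `db.put()` call.

```python
def in_batches(entities, batch_size=500):
    """Split a sequence into chunks no larger than batch_size, so each
    chunk can go to its own db.put()/db.delete() call and stay under
    the 500-entity limit mentioned above."""
    for i in range(0, len(entities), batch_size):
        yield entities[i:i + batch_size]
```

A caller would write something like `for batch in in_batches(all_entities): db.put(batch)`, keeping in mind that outside a transaction each batch commits independently, and inside a transaction the overall request deadline still applies.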
[google-appengine] Re: versions share db?
The entities in the datastore are shared; the code and static files are not. I don't think splitting the datastore per app version is such a good idea. What if data is added to the v1 datastore while you are testing v2? At the same time you would be modifying your v2 datastore (otherwise you wouldn't need different datastores per app version), so you would have to somehow merge the data from v1 into v2 before switching to v2.

On Jan 12, 1:46 pm, Roman wrote: > It appears that different versions of an app share the db. Do they > share other stuff? Is it possible to have a separate db for each > version? A pointer to a precise definition of what constitutes a > version of an app would be appreciated (http://code.google.com/appengine/docs/configuringanapp.html > doesn't tell me much).
[google-appengine] Re: Limit on number of entities per transaction
Thanks for the prompt response. I meant an actual datastore transaction; I was looking to see how many put calls I could make while being guaranteed not to hit any limits.

On Sun, Jan 11, 2009 at 9:05 PM, ryan wrote: > > if you mean in a single db.put() or db.delete() call, yes, the limit > is 500. if you mean in an actual datastore transaction, with multiple > put/delete calls, then no, there's no limit, just the overall http > request deadline.