[google-appengine] Re: Application Main Logs not working
How do I tell you my app ID without telling everyone else? I have only one app ID, so I assume there is some mapping from Gmail address to app ID.

My logs page has not changed since 7-13. The log level values on the logs page reflect neither the Sun logger levels nor the Apache log4j levels. Where do TRACE and CONFIG, FINE, FINER, and FINEST show up?

I was allowing exceptions to pass up the stack beyond my servlet code. Maybe that caused the HttpServletResponse.SC_INTERNAL_SERVER_ERROR 500 error code.

Richard

Jeff S (Google) wrote:
> Hi Richard,
>
> Could you tell us your app ID and one of the requests which resulted in a
> 500 error? I'd be happy to look into this, but there are quite a few
> things which could cause a 500 error, including typos in source code and
> such.
>
> Thank you,
>
> Jeff
>
> On Wed, Jul 15, 2009 at 11:38 AM, richard emberson
> <richard.ember...@gmail.com> wrote:
>
> When I send in a request that ought to log something
> at the INFO level, nothing shows up in the Log viewer.
>
> When I make some requests I get a 500 response code,
> internal server error, but nothing shows up in the
> Log viewer. So, I cannot tell what the problem is.
>
> Richard
> --
> Quis custodiet ipsos custodes

--
Quis custodiet ipsos custodes

You received this message because you are subscribed to the Google Groups "Google App Engine" group. To post to this group, send email to google-appengine@googlegroups.com. To unsubscribe from this group, send email to google-appengine+unsubscr...@googlegroups.com. For more options, visit this group at http://groups.google.com/group/google-appengine?hl=en
[google-appengine] Application Main Logs not working
When I send in a request that ought to log something at the INFO level, nothing shows up in the Log viewer.

When I make some requests I get a 500 response code, internal server error, but nothing shows up in the Log viewer. So, I cannot tell what the problem is.

Richard

--
Quis custodiet ipsos custodes
[google-appengine] Application Main Logs severity level
The severity levels in the log viewer are: DEBUG, INFO, WARNING, ERROR, CRITICAL.

The Java logger has levels: FINEST, FINER, FINE, CONFIG, INFO, WARNING, SEVERE.

What is the mapping between the two?

Thanks

Richard

--
Quis custodiet ipsos custodes
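For reference, a small sketch of the commonly described mapping. This is an assumption based on java.util.logging level ordering, not official App Engine documentation: everything below INFO appears as DEBUG, INFO as INFO, WARNING as WARNING, and SEVERE as ERROR (CRITICAL has no direct java.util.logging counterpart).

```java
import java.util.logging.Level;

public class SeverityMapping {
    // Assumed mapping from java.util.logging levels to the App Engine log
    // viewer severities, inferred from level ordering (not an official spec).
    public static String appEngineSeverity(Level level) {
        if (level.intValue() >= Level.SEVERE.intValue()) return "ERROR";
        if (level.intValue() >= Level.WARNING.intValue()) return "WARNING";
        if (level.intValue() >= Level.INFO.intValue()) return "INFO";
        return "DEBUG"; // covers FINEST, FINER, FINE, and CONFIG
    }

    public static void main(String[] args) {
        System.out.println(appEngineSeverity(Level.FINE));   // DEBUG
        System.out.println(appEngineSeverity(Level.CONFIG)); // DEBUG
        System.out.println(appEngineSeverity(Level.INFO));   // INFO
        System.out.println(appEngineSeverity(Level.SEVERE)); // ERROR
    }
}
```

Under this assumption, anything logged at FINE, FINER, FINEST, or CONFIG would show up only when the viewer's minimum severity is set to DEBUG.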
[google-appengine] Re: Your application is exceeding a quota
Nick,

Again, thank you for your response. Two items.

First item: Is there some Java API for the short-term quotas so that I can write code that "knows" how fast to go and when to stop? If not, are the upload short-term rate limits published somewhere? Are they based on number of requests (which I can control), number of bytes stored per minute (which I can control), GAE CPU time (which I cannot measure), or some other measure?

Second item: Thanks for talking about the Python bulk loader. It gave me an idea. Just as I extract data from the protocol layer without creating any Entities, during a bulk load I can push data into the protocol layer without creating any Entities. As you know, Entities only live in client application code, and during a bulk load there is no need to create any of them. Each Entity has a map with String names and values. If you have tens of thousands of these, it really consumes CPU and memory. So, just as I got a 20% performance improvement during extraction, I expect to get a performance improvement during loading.

Regards,

Richard

Nick Johnson (Google) wrote:
> On Wed, Jul 15, 2009 at 6:14 PM, richard
> emberson wrote:
>> Nick,
>>
>> Thank you for the response.
>>
>> I have tens of thousands of records to load. If I load them
>> all at once or "rate-limit" the load, won't I run out of the
>> short-term quotas just the same? Or did you mean that I
>> ought to "rate-limit" my load over a number of days or weeks?
>
> The problem you're running into is loading so rapidly that you're hitting
> the very short-term quotas intended to prevent you consuming all your
> daily quota at once. If you rate-limit enough, you can avoid hitting
> the short-term quotas while still staying within the daily quotas.
> Whether or not you need to rate-limit enough to cover more than one
> day, or buy extra quota, is another issue.
>
>> I am trying to determine whether, with large datasets, GAE is an
>> adequate platform onto which the application I have in mind
>> can be hosted. Currently, I am doing an evaluation. I've
>> not yet built the application because I want to know if
>> GAE has adequate performance.
>>
>> I have already re-written the client-side code that
>> extracts the data from the protocol layer and achieved a
>> 20% performance increase over the shipped 1.2.2 SDK on the
>> production GAE server (my new code was only 12% to 15%
>> faster on the local development server, so 20% was unexpected).
>> So, performance is critical for me - performance against
>> large datasets.
>>
>> I don't know if the Python bulkloader will be an improvement.
>> I ship the data up as CSV blocks which are parsed into Entities
>> and then stored. Pretty simple.
>
> The Python bulk loader does all the translation into entities on the
> client side, and then uses remote_api to send the encoded data over.
> This inevitably leads to less CPU utilization than parsing it yourself
> on the server. Nevertheless, the main reason I recommended the Python
> bulk loader is because it has support for concurrency and
> rate-limiting built right in.
>
>> Concerning the speed of deleting existing data: you suggested
>> using key-only queries. In my initial email that you responded
>> to, I had a short code snippet where, indeed, I set the
>> query to use keys only. So, was the code incorrect?
>
> Sorry, I didn't read the snippet in enough detail.
>
> -Nick Johnson
>
>> Richard Emberson
>>
>> Nick Johnson (Google) wrote:
>>> Hi Richard,
>>>
>>> You're running into short-term quotas, which are designed to prevent
>>> you exhausting your entire quota for the day in one go. You need to
>>> rate-limit your bulk loading code, and/or pay for additional quota.
>>> Even enabling billing without setting a high limit will increase your
>>> short-term quotas automatically.
>>>
>>> You should also look at your bulk loading code and make sure it's as
>>> efficient as possible. One possibility is to use the Python
>>> bulkloader.
>>>
>>> As far as deletion goes, make sure you are doing key-only queries to
>>> get the key to delete, which will save on CPU time and timeouts.
>>>
>>> -Nick Johnson
>>>
>>> On Wed, Jul 15, 2009 at 12:11 AM, richard
>>> emberson wrote:
>>>> So, once again, I've tried to upload some data.
>>>>
>>>> After a couple, I guess, thousand records I start
>>>> getting HttpServletResponse.SC_FORBIDDEN from
>>>> the app engine server.
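Since the short-term limits are not published, one pragmatic approach to the rate-limiting Nick describes is a client-side throttle in the upload loop. A minimal sketch, assuming the quota is request-based; the rate passed in is a value the caller would tune by experiment, not a documented limit:

```java
import java.util.concurrent.TimeUnit;

/**
 * A minimal client-side rate limiter sketch for bulk uploads. Calling
 * acquire() before each upload request spaces requests evenly, so a burst
 * of uploads cannot exceed the configured requests-per-second rate.
 */
public class UploadThrottle {
    private final long intervalNanos; // minimum spacing between requests
    private long nextAllowed = System.nanoTime();

    public UploadThrottle(double requestsPerSecond) {
        this.intervalNanos = (long) (TimeUnit.SECONDS.toNanos(1) / requestsPerSecond);
    }

    /** Blocks until the next request slot is available, then reserves it. */
    public synchronized void acquire() {
        long now = System.nanoTime();
        long wait = nextAllowed - now;
        if (wait > 0) {
            try {
                TimeUnit.NANOSECONDS.sleep(wait);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
        nextAllowed = Math.max(nextAllowed, now) + intervalNanos;
    }
}
```

Usage would be `throttle.acquire();` immediately before each batch POST; if SC_FORBIDDEN responses still appear, lower the rate (or back off and retry).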
[google-appengine] Re: Eating one's own dog food
I understand that BigTable is behind GAE, but my concern is more with GAE performance and quotas. If GAE had existed when Larry and Sergey were developing their PageRank algorithm, would they have used GAE for evaluation? I have my doubts. They would quickly have reached quota limits, way before they knew if they had a viable idea.

Richard

Tony wrote:
> Though I realize this is not exactly what you're asking, the concept
> of GAE is that it exposes some of the infrastructure that all Google
> applications rely on (i.e. Datastore) for others to use. So, in a
> sense, Google's various applications were using App Engine before App
> Engine existed. As far as I know, every Google service runs on the
> same homogeneous infrastructure, which is part of what makes it so
> reliable (and why the only available languages are Python and Java,
> languages used internally at Google).
>
> But I don't work there, so maybe I'm completely off-base.
>
> On Jul 15, 12:53 pm, richard emberson wrote:
>> Eating one's own dog food
>> http://en.wikipedia.org/wiki/Eating_one's_own_dog_food
>> or in this case: using one's own cloud.
>>
>> Amazon's cloud is based upon the IT technology they use
>> within Amazon. Salesforce.com's Force.com offering is what
>> they used to build their CRM system. These cloud vendors
>> "eat their own dog food".
>>
>> If a cloud vendor does not use their cloud offering for
>> their other products and/or internal systems, one
>> would have to assume that the cloud is viewed as
>> a technology ghetto within their own corporation - good
>> enough for others but not for themselves.
>>
>> So, concerning Google App Engine: are other groups
>> within Google clamoring to port or build their offerings
>> on top of App Engine? If so, please be specific: what
>> Google products and infrastructure, and what are the schedules
>> for their hosting on GAE?
>>
>> Is the GAE group supporting the Google Docs group as they
>> move to use GAE? How about Gmail; will the Google Gmail
>> group be relying on GAE support? I have not seen emails
>> from either of those internal Google groups on the GAE
>> mailing list. Lastly, when will Google search be supported
>> by the GAE group?
>>
>> Will those groups have to live under the same quota restrictions
>> while they evaluate using GAE? If not, why not? If the quotas
>> are unreasonable for an internal evaluation, what makes them
>> reasonable for an external evaluation?
>>
>> Evaluating whether or not GAE should be used for a particular
>> application is not free, even if one gets a very small slice
>> of GAE resources with which to do the evaluation.
>> Tens or hundreds of hours go into determining whether GAE has
>> the right characteristics, and quotas that limit how fast one
>> can work make it worse. (Yes, one can pay for higher quotas,
>> but during the evaluation phase paying is out of the question.)
>>
>> Richard Emberson
>>
>> --
>> Quis custodiet ipsos custodes

--
Quis custodiet ipsos custodes
[google-appengine] Re: Your application is exceeding a quota
Nick,

Thank you for the response.

I have tens of thousands of records to load. If I load them all at once or "rate-limit" the load, won't I run out of the short-term quotas just the same? Or did you mean that I ought to "rate-limit" my load over a number of days or weeks?

I am trying to determine whether, with large datasets, GAE is an adequate platform onto which the application I have in mind can be hosted. Currently, I am doing an evaluation. I've not yet built the application because I want to know if GAE has adequate performance.

I have already re-written the client-side code that extracts the data from the protocol layer and achieved a 20% performance increase over the shipped 1.2.2 SDK on the production GAE server (my new code was only 12% to 15% faster on the local development server, so 20% was unexpected). So, performance is critical for me - performance against large datasets.

I don't know if the Python bulkloader will be an improvement. I ship the data up as CSV blocks which are parsed into Entities and then stored. Pretty simple.

Concerning the speed of deleting existing data: you suggested using key-only queries. In my initial email that you responded to, I had a short code snippet where, indeed, I set the query to use keys only. So, was the code incorrect?

Richard Emberson

Nick Johnson (Google) wrote:
> Hi Richard,
>
> You're running into short-term quotas, which are designed to prevent
> you exhausting your entire quota for the day in one go. You need to
> rate-limit your bulk loading code, and/or pay for additional quota.
> Even enabling billing without setting a high limit will increase your
> short-term quotas automatically.
>
> You should also look at your bulk loading code and make sure it's as
> efficient as possible. One possibility is to use the Python
> bulkloader.
>
> As far as deletion goes, make sure you are doing key-only queries to
> get the key to delete, which will save on CPU time and timeouts.
>
> -Nick Johnson
>
> On Wed, Jul 15, 2009 at 12:11 AM, richard
> emberson wrote:
>> So, once again, I've tried to upload some data.
>>
>> After a couple, I guess, thousand records I start
>> getting HttpServletResponse.SC_FORBIDDEN from
>> the app engine server.
>>
>> On the Dashboard it says:
>>
>> Your application is exceeding a quota: CPU Time
>> Your application is exceeding a quota: Datastore CPU Time
>>
>> but under Resources, CPU Time usage is at 34%
>> and Stored Data usage is at 4%.
>>
>> I am trying to develop an application on GAE.
>> I will need to load tens of thousands or
>> a couple of hundred thousand entities as part
>> of testing the application. I will then want
>> to delete those entities.
>>
>> Currently, I can only load a couple of hundred
>> before the app engine starts rejecting additional
>> uploads. And I cannot delete any of them - I
>> keep getting timeouts - even if I try to delete only
>> 10.
>>
>> Is there some upload-per-minute quota or something?
>> And, what's the magic to delete stuff?
>>
>> The following code causes timeouts:
>>
>> DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
>> final Query q = new Query(kindName);
>> q.setKeysOnly();
>>
>> final Iterable<Entity> entities = ds.prepare(q).asIterable(
>>     FetchOptions.Builder.withLimit(count));
>> KeyIterable ki = new KeyIterable(entities);
>> ds.delete(ki);
>> int numberDeleted = ki.getCount();
>> return numberDeleted;
>>
>> Richard
>>
>> --
>> Quis custodiet ipsos custodes

--
Quis custodiet ipsos custodes
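For context, a sketch of the kind of server-side CSV handling described ("CSV blocks which are parsed into Entities"). The actual wire format is not shown in the thread, so this is illustrative only: it assumes plain comma-separated rows with a header line and no quoting or escaping. Each row becomes a name/value map, mirroring the "map with String names and values" inside an Entity; the datastore call itself is omitted.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

/**
 * Illustrative parser for an uploaded CSV block: the first line names the
 * properties, and each following line supplies one record's values. Each
 * record is returned as a property map from which an Entity could be built.
 */
public class CsvBlock {
    public static List<Map<String, String>> parse(String block) {
        String[] lines = block.split("\n");
        String[] header = lines[0].split(",");
        List<Map<String, String>> rows = new ArrayList<>();
        for (int i = 1; i < lines.length; i++) {
            String[] fields = lines[i].split(",");
            Map<String, String> row = new LinkedHashMap<>();
            // Pair each header name with the corresponding field value.
            for (int j = 0; j < header.length && j < fields.length; j++) {
                row.put(header[j], fields[j]);
            }
            rows.add(row);
        }
        return rows;
    }
}
```

The point of the "push data into the protocol layer" idea above is that these maps could be encoded directly, skipping the Entity objects entirely during a bulk load.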
[google-appengine] Eating one's own dog food
Eating one's own dog food
http://en.wikipedia.org/wiki/Eating_one's_own_dog_food
or in this case: using one's own cloud.

Amazon's cloud is based upon the IT technology they use within Amazon. Salesforce.com's Force.com offering is what they used to build their CRM system. These cloud vendors "eat their own dog food".

If a cloud vendor does not use their cloud offering for their other products and/or internal systems, one would have to assume that the cloud is viewed as a technology ghetto within their own corporation - good enough for others but not for themselves.

So, concerning Google App Engine: are other groups within Google clamoring to port or build their offerings on top of App Engine? If so, please be specific: what Google products and infrastructure, and what are the schedules for their hosting on GAE?

Is the GAE group supporting the Google Docs group as they move to use GAE? How about Gmail; will the Google Gmail group be relying on GAE support? I have not seen emails from either of those internal Google groups on the GAE mailing list. Lastly, when will Google search be supported by the GAE group?

Will those groups have to live under the same quota restrictions while they evaluate using GAE? If not, why not? If the quotas are unreasonable for an internal evaluation, what makes them reasonable for an external evaluation?

Evaluating whether or not GAE should be used for a particular application is not free, even if one gets a very small slice of GAE resources with which to do the evaluation. Tens or hundreds of hours go into determining whether GAE has the right characteristics, and quotas that limit how fast one can work make it worse. (Yes, one can pay for higher quotas, but during the evaluation phase paying is out of the question.)

Richard Emberson

--
Quis custodiet ipsos custodes
[google-appengine] Your application is exceeding a quota
So, once again, I've tried to upload some data.

After a couple, I guess, thousand records I start getting HttpServletResponse.SC_FORBIDDEN from the app engine server.

On the Dashboard it says:

Your application is exceeding a quota: CPU Time
Your application is exceeding a quota: Datastore CPU Time

but under Resources, CPU Time usage is at 34% and Stored Data usage is at 4%.

I am trying to develop an application on GAE. I will need to load tens of thousands or a couple of hundred thousand entities as part of testing the application. I will then want to delete those entities.

Currently, I can only load a couple of hundred before the app engine starts rejecting additional uploads. And I cannot delete any of them - I keep getting timeouts - even if I try to delete only 10.

Is there some upload-per-minute quota or something? And, what's the magic to delete stuff?

The following code causes timeouts:

DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
final Query q = new Query(kindName);
q.setKeysOnly();

final Iterable<Entity> entities = ds.prepare(q).asIterable(
    FetchOptions.Builder.withLimit(count));
// KeyIterable: custom wrapper (not shown) yielding each Entity's Key
KeyIterable ki = new KeyIterable(entities);
ds.delete(ki);
int numberDeleted = ki.getCount();
return numberDeleted;

Richard

--
Quis custodiet ipsos custodes
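One way to avoid the timeouts above is to delete in small batches rather than one large call, so each datastore operation stays well under the request deadline. A sketch of the batching half only; the datastore delete call is omitted, and the batch size of 100 is an illustrative guess, not a documented limit:

```java
import java.util.ArrayList;
import java.util.List;

public class Batcher {
    /**
     * Splits the items (e.g. the Keys from a keys-only query) into batches
     * of at most batchSize, so each delete call stays small and fast.
     */
    public static <T> List<List<T>> partition(Iterable<T> items, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        List<T> current = new ArrayList<>(batchSize);
        for (T item : items) {
            current.add(item);
            if (current.size() == batchSize) {
                batches.add(current);
                current = new ArrayList<>(batchSize);
            }
        }
        if (!current.isEmpty()) {
            batches.add(current); // final partial batch
        }
        return batches;
    }
}
```

Each resulting batch would then be passed to its own `ds.delete(batch)` call (or its own request), instead of handing the datastore one iterable covering everything at once.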
[google-appengine] Re: RAM Question
Yeah, I wondered about that also. The Hundred Years' War did start in 1337. Elite.

Paul Trippett wrote:
>> number of processors is 1337 (and I thought my quad was special).
>
> That spells "leet" in hacker speak; I wonder if that's a piece of Google humour.

--
Quis custodiet ipsos custodes
[google-appengine] Re: RAM Question
Boys, boys, boys (and girls), the first thing I did was to figure out some simple facts about the app engine server environment:

memory is about 100MB
number of processors is 1337 (and I thought my quad was special)
jdk version is 1.6.0_13, running the Java HotSpot(TM) Client VM (NOT the Server VM?)
server is running Linux

So, if every application had one processor to itself and each had 2GB of memory, then ...

RME

Wooble wrote:
> It seems unlikely, if only because populating GB of RAM from the
> datastore would involve fetching at least 2000 objects from the
> datastore before you hit the 30 second timeout.
>
> I don't believe the actual limit has been published, but I'd guess
> it's far lower than 2GB.
>
> On Jul 8, 12:35 pm, Rodion wrote:
>> Hello,
>>
>> I want my application to use 2 GB of RAM. Is it possible or not?
>>
>> Thank you!

--
Quis custodiet ipsos custodes
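A minimal sketch of how facts like those above can be probed using only standard Java APIs (on App Engine this would run inside a request handler, and the reported values are whatever the sandbox chooses to expose, not necessarily the physical hardware):

```java
/**
 * Prints basic facts about the JVM and its host environment: processor
 * count, JVM heap ceiling, Java version, VM name, and operating system.
 */
public class EnvProbe {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        System.out.println("processors: " + rt.availableProcessors());
        System.out.println("max memory: " + rt.maxMemory() / (1024 * 1024) + " MB");
        System.out.println("java:       " + System.getProperty("java.version"));
        System.out.println("vm:         " + System.getProperty("java.vm.name"));
        System.out.println("os:         " + System.getProperty("os.name"));
    }
}
```

Note that `Runtime.maxMemory()` reports the JVM's heap ceiling, which can differ from any per-application memory quota the platform enforces separately.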
[google-appengine] Re: HIPAA requirements vs. AppEngine security guidelines
Not going to happen. The IT requirements would cost Google far more than the couple of applications that might need HIPAA are worth. They would have to have a completely separate group with their own machines, passwords, procedures, etc., with a real wall (both a physical wall and a software/hardware wall) between that group and the rest of Google - or all of Google would have to be HIPAA compliant. So, how much is it worth to Google? Not much.

RME

Ken wrote:
> Hi,
>
> I'm researching the feasibility of running a healthcare app on the
> AppEngine cloud. I've read through the AE terms of service and they
> don't say much about the actual security guidelines other than
> deferring to the boilerplate Google security policy. I have no doubt
> there are internal documents detailing the exact security guarantees
> provided by Google's infrastructure, but that information is not
> readily available to the public.
>
> It's been a full year since the last time HIPAA was discussed in this
> group. Now that SSL support has been enabled, data transfer
> constraints can be met with ease. So, what's the story today with GAE
> and HIPAA compliance? Are the App Engine's data storage and transfer
> mechanisms compatible with the guidelines set out by HIPAA?
>
> Google Apps documentation has quite a bit more security information,
> such as specifying annual SAS 70 Type II audits. I'm not familiar
> with this particular security audit, but some quick research seems to
> indicate that SAS 70 audit controls are mostly a superset of HIPAA
> guidelines. However, there are some aspects of HIPAA compliance that
> seem to be difficult to implement in a distributed database system, so
> any reassurances from the Google App Engine folks in this regard would
> be most appreciated.
>
> Thanks!
>
> Ken

--
Quis custodiet ipsos custodes