Often this is caused by a catch-all dynamic request handler like this, which routes every request (including /robots.txt) through your Python application:

- url: .*
  script: main.py


You can eliminate this issue by adding a static handler for robots.txt before
the catch-all dynamic handler:

- url: /robots\.txt
  static_files: robots.txt
  upload: robots\.txt

- url: .*
  script: main.py
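
If you don't already have a robots.txt file in your application directory to
upload, a minimal one that allows all crawlers (just an example, adjust as
needed for your app) looks like this:

User-agent: *
Disallow: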


On Thu, Dec 4, 2008 at 1:51 PM, Fred <[EMAIL PROTECTED]> wrote:
>
> Hi,
>
> I was just checking my warning logs in the dashboard and noticed this:
>
> 65.55.231.106 - - [04/12/2008:03:23:08 -0800] "GET /robots.txt HTTP/1.1" 404 84 - -
>
> This request used a high amount of CPU, and was roughly 1.2 times over
> the average request CPU limit. High CPU requests have a small quota,
> and if you exceed this quota, your app will be temporarily disabled.
>
> Is CPU time reporting accurate, or is it still being perfected?
>
