Wow!

Nice reply, man!

I saw the other post talking about robots.txt, but my app is already
blessed and it's still not working. I'm using "Live HTTP Headers" (a Firefox
extension), and my app's robots.txt is returning a 304 Not Modified.

I'm already using Google Webmaster Tools, but for now it's useless because
Googlebot is blocked.

Thanks for your reply!

# Sorry for my English, I'm Spanish.

On Apr 7, 02:34, stephan <[EMAIL PROTECTED]> wrote:
> Masylum,
> According to Orion Henry last February 29th on this Google Group, he
> said "That's a feature that will be available for blessed accounts a
> little ways down the line."
>
> On a side-note, when you make your request for "blessing" to them --
> do let them know their own robots.txt is working -- but only by
> chance.
> If you go to http://heroku.com/robots.txt or to http://www.heroku.com/robots.txt
>
> You'll find that their robots.txt is written correctly; it's just
> that it is returning the wrong HTTP status code in both cases.
> It's returning a 301 instead of a 200 OK. A 301 status code is a
> permanent redirect, so the bot is sent elsewhere instead of being
> served the file itself.
>
> And the Google bot won't recognize a robots.txt (or a sitemap) that
> returns a 301. Now luckily for them, and they probably knew that
> already since they're saying it will only "be available down the
> line", they're not blocking anything with that file. So the fact that
> Google is not recognizing that file as valid doesn't matter. If the
> Google bot doesn't see a valid robots.txt file with the right status
> code, its default behavior is to index everything anyway.
>
> On the other hand, if you go to your own public app's robots.txt --
> let's say your app is called "foobar2000" --
> you'll find that http://foobar2000.heroku.com/robots.txt returns the
> right 200 OK.
>
> That's because your own robots.txt request is being intercepted by
> their HTTP proxy/web server, and that web server returns the
> right 200 OK code. Ruby can also return the right 200 OK status code;
> you just have to tell it to do so explicitly, and it's not an error
> most developers come across until they've been bitten by it.
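A minimal sketch of what "telling Ruby explicitly" could look like, assuming a bare Rack-style handler (the `public/robots.txt` path, the route check, and the fallback body are illustrative assumptions, not Heroku's actual setup):

```ruby
# Sketch: serve robots.txt with an explicit 200 OK from a plain
# Rack-style handler (any object responding to #call).
# The "public/robots.txt" path and the fallback body are assumptions.
ROBOTS_BODY = if File.exist?("public/robots.txt")
                File.read("public/robots.txt")
              else
                "User-agent: *\nDisallow:\n" # empty Disallow = allow everything
              end

robots_app = lambda do |env|
  if env["PATH_INFO"] == "/robots.txt"
    # Explicit 200 OK with a plain-text content type
    [200, { "Content-Type" => "text/plain" }, [ROBOTS_BODY]]
  else
    [404, { "Content-Type" => "text/plain" }, ["Not Found"]]
  end
end

status, _headers, body = robots_app.call("PATH_INFO" => "/robots.txt")
```

The point is only that the 200 is stated explicitly rather than left to a framework default that may redirect.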
>
> To double-check the status codes of an HTTP request, you can use a
> sniffer. An easy sniffer to install is "TamperData", a Firefox
> extension: https://addons.mozilla.org/en-US/firefox/addon/966 (it must be
> enabled once installed, and then it must be explicitly started from
> its dialog menu)
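If you'd rather not install an extension, curl can show the status code from the command line (foobar2000 is just the example app name from above -- substitute your own subdomain):

```shell
# Print only the HTTP status code of the robots.txt response
curl -s -o /dev/null -w "%{http_code}\n" http://foobar2000.heroku.com/robots.txt

# Or dump the response headers and read the status line directly
curl -sI http://foobar2000.heroku.com/robots.txt
```

A 200 means the file itself was served; a 301 means you were redirected and the bot will not treat the file as valid.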
>
> Anyway, good luck Masylum, and if you haven't done so already -- try
> the "Google Webmaster Tools" once you get this robots.txt working:
> https://www.google.com/webmasters/tools/siteoverview?hl=en (it's an
> important tool)
>
> - Stephan
> http://quickspikes.com
>
> On Apr 6, 5:05 am, "[EMAIL PROTECTED]" <[EMAIL PROTECTED]> wrote:
>
> > Does anybody know how to change the robots.txt?
>
> > I modified the file in the public folder but the bots remain blocked.
>
> > Maybe I'm doing something wrong, but I need to be accessible to
> > googlebot at least.
>
> > Thanks in advance! :)
--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Heroku" group.
To post to this group, send email to heroku@googlegroups.com
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at 
http://groups.google.com/group/heroku?hl=en
-~----------~----~----~----~------~----~------~--~---
