You'll probably be better off creating a static robots.txt file and
altering your .htaccess rewrite rule to allow direct access to .txt
files:

RewriteEngine on
RewriteRule !\.(js|ico|gif|jpg|png|css|txt)$ index.php
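
The static file itself can just contain what your robots action
currently generates; for example (the Disallow paths below are only
placeholders for whatever is in your config):

User-agent: *
Disallow: /admin/
Disallow: /tmp/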

Performance-wise, this will be much smarter, as I doubt you need a
dynamic robots.txt.
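
As an aside, if you ever do need the dynamic version: that
"Autodetection of Locale has been failed!" exception usually means some
component constructed its own Zend_Locale and could not autodetect one,
because merely creating $locale in index.php does not make it available
to the rest of the framework. A minimal sketch (assuming ZF1 and your
existing bootstrap) would be to register it:

require_once 'Zend/Locale.php';
require_once 'Zend/Registry.php';

// Hypothetical addition to index.php: store the locale in the registry
// so other ZF components find it instead of trying to autodetect one.
$locale = new Zend_Locale('en_US');
Zend_Registry::set('Zend_Locale', $locale);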

Shahar.


On Mon, 2008-01-21 at 05:07 -0800, digitalus_media wrote:
> I analyzed the robots.txt file and found it was returning this error:
> 
> Fatal error:  Uncaught exception 'Zend_Locale_Exception' with message
> 'Autodetection of Locale has been failed!' ....
> 
> I have set the locale in index.php:
> 
> $locale  = new Zend_Locale('en_US');
> 
> and have set a static route to the robots file:
> 
> //robots.txt
> $route = new Zend_Controller_Router_Route_Static(
>     'robots.txt',
>     array(
>         'module'     => 'public',
>         'controller' => 'systemPages',
>         'action'     => 'robots'
>     )
> );
> $router->addRoute('robots.txt', $route); 
> 
> action:
> 
>       function robotsAction()
>       {
>          $this->getResponse()->setHeader('Content-Type', 'text/plain'); 
>          $disallow = $this->_config->robots->toArray();
>          if(is_array($disallow)){
>              echo "User-agent: * \n";
>              foreach ($disallow as $dir) {
>                  echo "Disallow: " . $dir . " \n";
>              }
>          }
>       }
> 
> 
