[ https://issues.apache.org/jira/browse/DTACLOUD-440?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13565498#comment-13565498 ]

Marios Andreou commented on DTACLOUD-440:
-----------------------------------------

[from David Lutterkort on the [email protected] mailing list]:


ACK, with one nit:

> diff --git a/server/lib/deltacloud_rack.rb b/server/lib/deltacloud_rack.rb
> index 8ddcb41..1a1d5b3 100644
> --- a/server/lib/deltacloud_rack.rb
> +++ b/server/lib/deltacloud_rack.rb
> @@ -68,6 +68,10 @@ module Deltacloud
>  
>      set :views, File.join(File.dirname(__FILE__), '..', 'views')
>  
> +    get '/robots.txt' do
> +      File.read(File.join('public', 'robots.txt'))
> +    end

This should be

        get '/robots.txt' do
                send_file File.join('public', 'robots.txt')
        end

so that we get the right content-type in the response.

David
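
For context, the route simply serves whatever sits in public/robots.txt. Given that the issue wants crawlers told not to look any further, a minimal public/robots.txt would be along these lines (a sketch only; the actual file is in the attached patch and may differ):

        # Block all crawlers from every path on the server
        User-agent: *
        Disallow: /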

> We need a robots.txt
> --------------------
>
>                 Key: DTACLOUD-440
>                 URL: https://issues.apache.org/jira/browse/DTACLOUD-440
>             Project: DeltaCloud
>          Issue Type: Bug
>          Components: Server
>            Reporter: David Lutterkort
>            Assignee: Marios Andreou
>         Attachments: 0001-Activate-robots.txt-for-Deltacloud-servers-DTACLOUD-.patch
>
>
> The public endpoints get crawled, leading to lots of 403s. We should have a
> /robots.txt that tells crawlers not to bother looking any further.

