Multiple robots.txt for subdomains in Rails

轻奢々 2021-01-31 23:21

I have a site with multiple subdomains, and I want the named subdomains' robots.txt to be different from the www one.

I tried to use .htaccess, but FastCGI doesn't look at it.

6 Answers
  •  长发绾君心
    2021-01-31 23:39

    I liked TA Tyree's solution, but it is very Rails 2.x centric, so here is what I came up with for Rails 3.1.x.

    config/initializers/mime_types.rb

    # Map the .txt extension to the text/plain MIME type
    Mime::Type.register "text/plain", :txt
    

    By adding the format in the routes, you don't have to worry about using a respond_to block in the controller.

    config/routes.rb

    match '/robots.txt'   => 'robots#robots',   :format => "text"
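
    A note if you are on Rails 4 or newer: match without a via: option is no longer allowed there, so a sketch of the equivalent route (same controller as below, my rendering of the newer syntax) would be:

    get '/robots.txt' => 'robots#robots', defaults: { format: 'text' }

    On Rails 5.1+ you would also need render plain: instead of render :text in the controller, since render :text was removed.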
    

    I added a little something extra on this one. The SEO people were complaining about duplicated content both on subdomains and on SSL pages, so I created two robots files: one for production and one for non-production, the latter of which is also served for any SSL/HTTPS requests in production.

    app/controllers/robots_controller.rb

    class RobotsController < ApplicationController
      def robots
        site = request.host
        protocol = request.protocol
        # Only the canonical hosts over plain HTTP get the production file;
        # other hosts and SSL/HTTPS requests get the non-production one.
        if (site.eql?("mysite.com") || site.eql?("www.mysite.com")) && protocol.eql?("http://")
          domain = "production"
        else
          domain = "nonproduction"
        end
        robots = File.read("#{Rails.root}/config/robots-#{domain}.txt")
        render :text => robots, :layout => false
      end
    end
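
    For completeness, the two files that controller reads might look something like this. The exact rules are up to you; these contents are just an illustration where production is fully crawlable and everything else is blocked.

    config/robots-production.txt

    User-agent: *
    Disallow:

    config/robots-nonproduction.txt

    User-agent: *
    Disallow: /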
    
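    Finally, since the question asked about varying robots.txt per subdomain rather than per environment, here is a minimal sketch of the same controller keyed off request.subdomain instead. The robots-<subdomain>.txt naming and the www fallback are my assumptions, not part of the answer above.

    class RobotsController < ApplicationController
      def robots
        # Hypothetical layout: one file per subdomain under config/,
        # e.g. config/robots-www.txt, config/robots-blog.txt, ...
        subdomain = request.subdomain.presence || "www"
        path = Rails.root.join("config", "robots-#{subdomain}.txt")
        # Fall back to the www file for subdomains without a dedicated file.
        path = Rails.root.join("config", "robots-www.txt") unless File.exist?(path)
        render :text => File.read(path), :layout => false
      end
    end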
