I have a site with multiple subdomains and I want the named subdomains' robots.txt to be different from the www one.
I tried to use .htaccess, but FastCGI doesn't look at it.
I liked TA Tyree's solution, but it is very Rails 2.x-centric, so here is what I came up with for Rails 3.1.x.
mime_types.rb
Mime::Type.register "text/plain", :txt
By adding the format in the routes, you don't have to worry about using a respond_to block in the controller.
routes.rb
match '/robots.txt' => 'robots#robots', :format => "text"
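For comparison, without the :format option on the route you would handle the format in the action yourself. A minimal sketch of that respond_to alternative (not part of the original answer, shown only to illustrate what the route option saves you):
class RobotsController < ApplicationController
  def robots
    respond_to do |format|
      # :text is the built-in Rails symbol for the text/plain MIME type
      format.text { render :text => File.read("#{Rails.root}/config/robots.txt"), :layout => false }
    end
  end
end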
I added a little something extra on this one. The SEO people were complaining about duplicated content both on subdomains and on SSL pages, so I created two robots files: one for production and one for non-production, with the non-production file also served for any SSL/HTTPS requests in production.
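The original answer doesn't show the two files, but as an illustration they might look like this: the production file allows crawling, while the non-production file blocks everything.
config/robots-production.txt
User-agent: *
Disallow:

config/robots-nonproduction.txt
User-agent: *
Disallow: /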
robots_controller.rb
class RobotsController < ApplicationController
  def robots
    site = request.host
    protocol = request.protocol
    # Serve the production robots file only for plain HTTP requests to the main
    # domain; subdomains and SSL/HTTPS requests get the non-production file.
    domain = (site.eql?("mysite.com") || site.eql?("www.mysite.com")) && protocol.eql?("http://") ? "production" : "nonproduction"
    robots = File.read("#{Rails.root}/config/robots-#{domain}.txt")
    render :text => robots, :layout => false
  end
end
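If you need genuinely different robots.txt content per named subdomain, as the question asks, rather than just production versus non-production, the same pattern extends naturally. This is a hedged sketch assuming you create files named config/robots-<subdomain>.txt alongside a config/robots-www.txt fallback:
class RobotsController < ApplicationController
  def robots
    # request.subdomain is "" for mysite.com, "www" for www.mysite.com, etc.
    name = request.subdomain.presence || "www"
    path = Rails.root.join("config", "robots-#{name}.txt")
    # Fall back to the www file when no subdomain-specific file exists.
    path = Rails.root.join("config", "robots-www.txt") unless File.exist?(path)
    render :text => File.read(path), :layout => false
  end
end
You can check the result per host with something like curl -H "Host: sub.mysite.com" http://localhost:3000/robots.txt.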