I have a site with multiple subdomains and I want the named subdomains' robots.txt to be different from the www one.
I tried to use .htaccess, but FastCGI doesn't look at it.
If you can't configure your HTTP server to do this before the request is sent to Rails, I would just set up a 'robots' controller that renders a template like:
def show_robot
  subdomain = request.subdomain # get subdomain, escape/validate as needed
  render :text => open("robots.#{subdomain}.txt").read, :layout => false
end
Depending on what you're trying to accomplish, you could also use a single template instead of a bunch of different files.
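For example, a single template could branch on the subdomain; the template path, subdomain name, and crawl rules below are only illustrative, and the later answers show how to wire up the route and text/plain format:
<%# app/views/robots/show_robot.txt.erb %>
<% if request.subdomain == "staging" %>
User-agent: *
Disallow: /
<% else %>
User-agent: *
Disallow: /admin
<% end %>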
I liked TA Tyree's solution, but it is very Rails 2.x-centric, so here is what I came up with for Rails 3.1.x.
mime_types.rb
Mime::Type.register "text/plain", :txt
By adding the format in the routes, you don't have to worry about using a respond_to block in the controller.
routes.rb
match '/robots.txt' => 'robots#robots', :format => "text"
I added a little something extra on this one. The SEO people were complaining about duplicate content both on subdomains and on SSL pages, so I created two robots files: one for production and one for non-production, with the non-production one also served for any SSL/HTTPS requests in production.
robots_controller.rb
class RobotsController < ApplicationController
  def robots
    site = request.host
    protocol = request.protocol
    domain = (site.eql?("mysite.com") || site.eql?("www.mysite.com")) && protocol.eql?("http://") ? "production" : "nonproduction"
    robots = File.read("#{Rails.root}/config/robots-#{domain}.txt")
    render :text => robots, :layout => false
  end
end
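The two files referenced above could be as simple as the following (illustrative contents only; adjust the rules to your own site). Production allows crawling, while the non-production/SSL variant blocks it to avoid the duplicate-content complaints:
config/robots-production.txt:
User-agent: *
Disallow:
config/robots-nonproduction.txt:
User-agent: *
Disallow: /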
As of Rails 6.0 this has been greatly simplified.
By default, if you use the :plain option, the text is rendered without using the current layout. If you want Rails to put the text into the current layout, you need to add the layout: true option and use the .text.erb extension for the layout file (per the Rails layouts and rendering guide).
class RobotsController < ApplicationController
  def robots
    subdomain = request.subdomain # Whatever logic you need
    robots = File.read("#{Rails.root}/config/robots.#{subdomain}.txt")
    render plain: robots
  end
end
In routes.rb
get '/robots.txt', to: 'robots#robots'
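One caveat with the controller above: File.read raises Errno::ENOENT if there is no config/robots.<subdomain>.txt for a given subdomain, so a fallback may be worth adding. A minimal sketch, where the default filename is just an assumption:
class RobotsController < ApplicationController
  def robots
    path = Rails.root.join("config", "robots.#{request.subdomain}.txt")
    # Fall back to a default file when no subdomain-specific one exists
    path = Rails.root.join("config", "robots.default.txt") unless File.exist?(path)
    render plain: File.read(path)
  end
end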
Why not use Rails' built-in views?
In your controller add this method:
class StaticPagesController < ApplicationController
  def robots
    render :layout => false, :content_type => "text/plain", :formats => :txt
  end
end
In the views, create the file app/views/static_pages/robots.txt.erb with your robots.txt content.
In routes.rb, place:
get '/robots.txt' => 'static_pages#robots'
Delete the file /public/robots.txt
You can add specific business logic as needed, but this way we don't read any custom files.
For Rails 3:
Create a controller RobotsController:
class RobotsController < ApplicationController
  # This controller renders the correct 'robots' view depending on your subdomain.
  def robots
    subdomain = request.subdomain # you should also check for emptiness
    render "robots.#{subdomain}"
  end
end
Create robots views (1 per subdomain):
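With the render call above, the views live under app/views/robots/ and are named after each subdomain. For example, with hypothetical subdomains www and admin, you would create app/views/robots/robots.www.txt.erb and app/views/robots/robots.admin.txt.erb.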
Add a new route in config/routes.rb: (note the :txt format option)
match '/robots.txt' => 'robots#robots', :format => :txt
And of course, you should declare the :txt format in config/initializers/mime_types.rb:
Mime::Type.register "text/plain", :txt
Hope it helps.
Actually, you probably want to set a MIME type in mime_types.rb and do it in a respond_to block so it doesn't return it as 'text/html':
Mime::Type.register "text/plain", :txt
Then, your routes would look like this:
map.robots '/robots.txt', :controller => 'robots', :action => 'robots'
For Rails 3:
match '/robots.txt' => 'robots#robots'
and the controller would be something like this (put the file(s) wherever you like):
class RobotsController < ApplicationController
  def robots
    subdomain = request.subdomain # get subdomain, escape/validate as needed
    robots = File.read(RAILS_ROOT + "/config/robots.#{subdomain}.txt")
    respond_to do |format|
      format.txt { render :text => robots, :layout => false }
    end
  end
end
At the risk of overengineering it, I might even be tempted to cache the file read operation...
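A minimal sketch of that caching idea, assuming Rails.cache is available (the cache key and expiry are arbitrary choices):
def robots
  subdomain = request.subdomain
  robots = Rails.cache.fetch("robots/#{subdomain}", :expires_in => 1.hour) do
    # Only hit the filesystem when the cache entry is missing or expired
    File.read(Rails.root.join("config", "robots.#{subdomain}.txt"))
  end
  respond_to do |format|
    format.txt { render :text => robots, :layout => false }
  end
end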
Oh, yeah, you'll almost certainly have to remove/move the existing public/robots.txt file.
Astute readers will notice that you can easily substitute RAILS_ENV for subdomain...
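That per-environment variant would just swap the lookup key, with the same caveats as above, e.g.:
robots = File.read(RAILS_ROOT + "/config/robots.#{RAILS_ENV}.txt")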