Rails manually redirecting from naked domain - ruby-on-rails

So currently I am manually redirecting from a naked domain due to restrictions with my hosting provider (Heroku). Everything works just fine. The problem is that if a user visits mydomain.com/route, a redirect will be issued back to www.mydomain.com without the /route. How would I go about re-appending the route while still redirecting to www?
class ApplicationController < ActionController::Base
  protect_from_forgery
  before_filter :ensure_domain

  APP_DOMAIN = 'www.domain.com'

  def index
  end

  def ensure_domain
    if Rails.env.production?
      if request.env['HTTP_HOST'] != APP_DOMAIN
        redirect_to "http://#{APP_DOMAIN}", :status => 301
      end
    end
  end
end
EDIT
I removed my code above from my ApplicationController, and opted for using the refraction gem as suggested by hurikhan77, which solved my problem.
Here is the refraction_rules.rb I used:
Refraction.configure do |req|
  if req.host == "domain.com"
    req.permanent! :host => "www.domain.com"
  end
end

I suggest using the refraction gem for this: http://rubygems.org/gems/refraction

Ideally, you would set up rules like that in your web server configuration. Requests would become faster because they would not even reach the Rails stack, and there would be no need to add any code to your app either.
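For example, with nginx you could redirect at the server level along these lines (a sketch; the server names and port are assumptions to adapt to your setup):
server {
  listen 80;
  server_name domain.com;
  # Permanently redirect the naked domain to www, preserving path and query string
  return 301 http://www.domain.com$request_uri;
}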
However, if you are running in some restricted environment like Heroku, I'd advise adding a Rack middleware (just as a guideline; I can't guarantee this particular code is bug-free):
class Redirector
  SUBDOMAIN = 'www'

  def initialize(app)
    @app = app
  end

  def call(env)
    @env = env
    if redirect?
      redirect
    else
      @app.call(env)
    end
  end

  private

  def redirect?
    # do some regex to figure out if you want to redirect
  end

  def redirect
    headers = {
      "Location"     => redirect_url,
      "Content-Type" => "text/plain" # Rack::Lint requires a Content-Type on 301s
    }
    [301, headers, ["You are being redirected..."]] # 301 for permanent, 302 for temporary
  end

  def redirect_url
    scheme = @env["rack.url_scheme"]
    if @env['SERVER_PORT'] == '80'
      port = ''
    else
      port = ":#{@env['SERVER_PORT']}"
    end
    path = @env["PATH_INFO"]
    query_string = ""
    if !@env["QUERY_STRING"].empty?
      query_string = "?" + @env["QUERY_STRING"]
    end
    host = "://#{SUBDOMAIN}." + domain # this is where we add the subdomain
    "#{scheme}#{host}#{port}#{path}#{query_string}"
  end

  def domain
    # extract domain from request or get it from an environment variable etc.
  end
end
You can also test the whole thing in isolation:
describe Redirector do
  include Rack::Test::Methods

  def default_app
    lambda { |env|
      headers = {'Content-Type' => "text/html"}
      headers['Set-Cookie'] = "id=1; path=/\ntoken=abc; path=/; secure; HttpOnly"
      [200, headers, ["default body"]]
    }
  end

  def app
    @app ||= Rack::Lint.new(Redirector.new(default_app))
  end

  it "redirects unsupported subdomains" do
    get "http://example.com/zomg?a=1"
    last_response.status.should eq 301
    last_response.header['location'].should eq "http://www.example.com/zomg?a=1"
  end

  # and so on
end
Then you can add it to production (or any preferred environments) only:
# config/environments/production.rb
# ...
config.middleware.insert_after 'ActionDispatch::Static', 'Redirector'
If you want to test it in development, add the same line to development.rb and add a record to your hosts file (usually /etc/hosts) mapping yoursubdomain.localhost to 127.0.0.1.
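For example, a hosts file entry might look like this (assuming www is the subdomain you are testing):
# /etc/hosts
127.0.0.1 www.localhost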

Not sure if this is the best solution, but you could regex the request.referrer, pull out anything after the .com, and append it to APP_DOMAIN.
Or I guess you could just take out everything before the first . in request.env['HTTP_HOST'] and replace it with http://www., assuming you don't plan on using subdomains.
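A rough sketch in that spirit (hedged: rather than replacing everything before the first dot, this simply prepends www when it's missing, which also preserves the requested path; ensure_www is a hypothetical filter name):
# Prepend www to a naked host, keeping the full path and query string
def ensure_www
  host = request.env['HTTP_HOST']
  unless host =~ /^www\./
    redirect_to "http://www.#{host}#{request.fullpath}", :status => 301
  end
end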

Related

Rack::Proxy does not work for a specific path

I got stuck and am not sure what is happening.
If I do not match on a path and just run perform_request, it works fine:
def perform_request(env)
  env["HTTP_HOST"] = "node:8765"
  super(env)
end
However, when I want to proxy only if the request matches some path, it does not work:
def perform_request(env)
  request = Rack::Request.new(env)
  if request.path =~ %r{^/api/check}
    # puts "Weird!!!!!"
    env["HTTP_HOST"] = "node:8765"
    super(env)
  else
    @app.call(env)
  end
end
Configured as middleware (config/application.rb):
config.middleware.use NProxy, {ssl_verify_none: true}

Rails: dynamic robots.txt with erb

I'm trying to render a dynamic text file (robots.txt) in my Rails (3.0.10) app, but it continues to be rendered as HTML (according to the console).
Route:
match 'robots.txt' => 'sites#robots'
Controller:
class SitesController < ApplicationController
  respond_to :html, :js, :xml, :css, :txt

  def robots
    @site = Site.find_by_subdomain # blah blah
  end
end
app/views/sites/robots.txt.erb:
Sitemap: <%= @site.url %>/sitemap.xml
But when I visit http://www.example.com/robots.txt I get a blank page/source, and the log says:
Started GET "/robots.txt" for 127.0.0.1 at 2011-11-21 11:22:13 -0500
Processing by SitesController#robots as HTML
Site Load (0.4ms) SELECT `sites`.* FROM `sites` WHERE (`sites`.`subdomain` = 'blah') ORDER BY created_at DESC LIMIT 1
Completed 406 Not Acceptable in 828ms
Any idea what I'm doing wrong?
Note: I added this to config/initializers/mime_types.rb, because Rails was complaining about not knowing the .txt MIME type:
Mime::Type.register_alias "text/plain", :txt
Note 2: I did remove the stock robots.txt from the public directory.
NOTE: This is a repost from coderwall.
Following advice from a similar answer on Stack Overflow, I currently use the following solution to render a dynamic robots.txt based on the request's host parameter.
Routing
# config/routes.rb
#
# Dynamic robots.txt
get 'robots.:format' => 'robots#index'
Controller
# app/controllers/robots_controller.rb
class RobotsController < ApplicationController
  # No layout
  layout false

  # Render a robots.txt file based on whether the request
  # is performed against a canonical url or not.
  # Prevent robots from indexing content served via a CDN twice.
  def index
    if canonical_host?
      render 'allow'
    else
      render 'disallow'
    end
  end

  private

  def canonical_host?
    request.host =~ /plugingeek\.com/
  end
end
Views
Based on the request.host we render one of two different .text.erb view files.
Allowing robots
# app/views/robots/allow.text.erb # Note the .text extension
# Allow robots to index the entire site except some specified routes
# rendered when site is visited with the default hostname
# http://www.robotstxt.org/
# ALLOW ROBOTS
User-agent: *
Disallow:
Banning spiders
# app/views/robots/disallow.text.erb # Note the .text extension
# Disallow robots to index any page on the site
# rendered when robot is visiting the site
# via the Cloudfront CDN URL
# to prevent duplicate indexing
# and search results referencing the Cloudfront URL
# DISALLOW ROBOTS
User-agent: *
Disallow: /
Specs
Testing the setup with RSpec and Capybara can be done quite easily, too.
# spec/features/robots_spec.rb
require 'spec_helper'

feature "Robots" do
  context "canonical host" do
    scenario "allow robots to index the site" do
      Capybara.app_host = 'http://www.plugingeek.com'
      visit '/robots.txt'
      Capybara.app_host = nil

      expect(page).to have_content('# ALLOW ROBOTS')
      expect(page).to have_content('User-agent: *')
      expect(page).to have_content('Disallow:')
      expect(page).to have_no_content('Disallow: /')
    end
  end

  context "non-canonical host" do
    scenario "deny robots to index the site" do
      visit '/robots.txt'

      expect(page).to have_content('# DISALLOW ROBOTS')
      expect(page).to have_content('User-agent: *')
      expect(page).to have_content('Disallow: /')
    end
  end
end
# This would be the resulting docs
# Robots
# canonical host
# allow robots to index the site
# non-canonical host
# deny robots to index the site
As a last step, you might need to remove the static public/robots.txt if it's still present.
I hope you find this useful. Feel free to comment, helping to improve this technique even further.
One solution that works in Rails 3.2.3 (not sure about 3.0.10) is as follows:
1) Name your template file robots.text.erb # Emphasis on text vs. txt
2) Setup your route like this: match '/robots.:format' => 'sites#robots'
3) Leave your action as is (you can remove the respond_to in the controller)
def robots
  @site = Site.find_by_subdomain # blah blah
end
This solution also eliminates the need to explicitly specify txt.erb in the render call mentioned in the accepted answer.
For my Rails projects I usually have a separate controller for the robots.txt response:
class RobotsController < ApplicationController
  layout nil

  def index
    host = request.host
    if host == 'lawc.at' # live server
      render 'allow.txt', :content_type => "text/plain"
    else # test server
      render 'disallow.txt', :content_type => "text/plain"
    end
  end
end
Then I have views named disallow.txt.erb and allow.txt.erb.
And in my routes.rb I have:
get "robots.txt" => 'robots#index'
I don't like the idea of a robots.txt request reaching my Rails app.
If you are using Nginx/Apache as your reverse proxy, static files are much faster for them to serve than a request that reaches Rails itself.
This is much cleaner, and I think it is faster too.
Try using the following setting.
nginx.conf - for production
location /robots.txt {
  alias /path-to-your-rails-public-directory/production-robots.txt;
}
nginx.conf - for stage
location /robots.txt {
  alias /path-to-your-rails-public-directory/stage-robots.txt;
}
I think the problem is that if you define respond_to in your controller, you have to use respond_with in the action:
def robots
  @site = Site.find_by_subdomain # blah blah
  respond_with @site
end
Also, try explicitly specifying the .erb file to be rendered:
def robots
  @site = Site.find_by_subdomain # blah blah
  render 'sites/robots.txt.erb'
end

Rails 3.1 Routes: How to add a locale to beginning of URI when missing?

I'm trying to insert a locale at the beginning of a request URI in a Rails 3.1 app if it is missing. I created a Ruby script that does what I want:
uri = "/products"
re = /\A\/((?:[a-z]{2,2})(?:[-|_](?:[A-Z]{2,2}))?)(\/.*)\Z/

unless uri =~ re
  uri = "/en#{uri}"
end

puts uri
So, if the request URI is /en-GB/products (the locale is already present), it doesn't do anything. If it is /products (like the example above), it spits out /en/products.
Now I'm trying to get it to work in my routes file. Here's what I've attempted:
match "(*all)", :to => redirect do |params, request|
uri = request.path_info
re = /\A\/((?:[a-z]{2,2})(?:[-|_](?:[A-Z]{2,2}))?)(\/.*)\Z/
unless uri =~ re
uri = "/en#{uri}"
end
"#{request.scheme}://#{request.host_with_port}#{uri}"
end
My problem is that I can't even get inside the match block. I keep getting an ArgumentError: redirection argument not supported.
I've tried changing it to match "(*all)" => redirect do |params, request| to no avail.
I'm looking at the Rails 3.1 API documentation for these examples.
Is the routes file the place to try and do this? It makes the most sense to me.
Introducing logic in routes smells to me. Controllers are meant for that, so I would use an optional scope in the routes and a before_filter with redirect_to in the controller.
routes.rb - keep it simple:
scope '(:locale)', :constraints => { :locale => /[a-z]{2}(-[A-Z]{2})?/ } do
  match 'url1' ...
  match 'other' ...
end
controller:
before_filter :check_locale

protected

def check_locale
  redirect_to "/en#{request.path_info}" if params[:locale].blank?
end
(the above is written from memory)
I find these lines in a before_filter in the ApplicationController quite useful.
These lines extract the locale and redirect, e.g., foo.com/fie to foo.com/en/fie (or whatever the current locale is). If the user has an unsupported locale, he gets a hint that he can go on with English...
def set_locale
  params_locale = params[:locale]
  if params_locale
    if !Supportedlocale::SUPPORTED.include?(params_locale)
      redirect_to "/en/localenotsupported/"
    end
  end

  language_locale = locale_from_accept_language
  default_locale = I18n.default_locale
  I18n.locale = params_locale || language_locale || default_locale

  if params_locale.blank?
    redirect_to "/#{I18n.locale}#{request.path_info}"
  end
end

def locale_from_accept_language
  accepted_lang = request.env['HTTP_ACCEPT_LANGUAGE']
  if !accepted_lang.nil?
    accepted_lang.scan(/^[a-z]{2}/).first
  else
    "en" # en is the default!
  end
end
In order to keep parameters like pagination, do something like:
def check_locale
  if params[:locale].blank?
    I18n.locale = :en
    redirect_to params.merge!(:locale => I18n.locale)
  end
end
So:
/controler/action?page=10&search=dada => /en/controler/action?page=10&search=dada

Geocoder, how to test locally when ip is 127.0.0.1?

I can't get geocoder to work correctly, as my local IP address is 127.0.0.1, so it can't locate where I am.
The request.location.ip shows "127.0.0.1".
How can I use a different IP address (my internet connection's IP) so it will bring back more relevant data?
A nice clean way to do it is using middleware. Add this class to your lib directory:
# lib/spoof_ip.rb
class SpoofIp
  def initialize(app, ip)
    @app = app
    @ip = ip
  end

  def call(env)
    env['HTTP_X_FORWARDED_FOR'] = nil
    env['REMOTE_ADDR'] = env['action_dispatch.remote_ip'] = @ip
    @status, @headers, @response = @app.call(env)
    [@status, @headers, @response]
  end
end
Then find an IP address you want to use for your development environment and add this to your development.rb file:
config.middleware.use('SpoofIp', '64.71.24.19')
For this I usually use params[:ip] or something similar in development. That allows me to test other IP addresses for functionality and pretend I'm anywhere in the world.
For example
class ApplicationController < ActionController::Base
  def request_ip
    if Rails.env.development? && params[:ip]
      params[:ip]
    else
      request.remote_ip
    end
  end
end
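You would then pass the helper's value wherever you would normally use the remote IP, for example (hypothetical usage):
# e.g. in an action
def show
  @location = Geocoder.search(request_ip).first
end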
I implemented this slightly differently, and it works well for my case.
In application_controller.rb I have a lookup method which calls the Geocoder IP lookup directly, passing in the result of request.remote_ip.
def lookup_ip_location
  if Rails.env.development?
    Geocoder.search(request.remote_ip).first
  else
    request.location
  end
end
Then in config/environments/development.rb I monkey-patched the remote_ip call:
class ActionDispatch::Request
  def remote_ip
    "71.212.123.5" # ipd home (Denver, CO or Renton, WA)
    # "208.87.35.103" # websiteuk.com -- Nassau, Bahamas
    # "50.78.167.161" # HOL Seattle, WA
  end
end
I just hard code some addresses, but you could do whatever you'd like here.
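A variant, if you would rather not edit code to switch addresses (a sketch; FAKE_REMOTE_IP is a made-up environment variable name):
class ActionDispatch::Request
  def remote_ip
    # Fall back to localhost when the variable is not set
    ENV.fetch('FAKE_REMOTE_IP', '127.0.0.1')
  end
end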
I had the same question. Here is how I implemented it with geocoder.
# Gemfile
gem 'httparty', :require => 'httparty', :group => :development

# application_controller.rb
def request_ip
  if Rails.env.development?
    response = HTTParty.get('http://api.hostip.info/get_html.php')
    ip = response.split("\n")
    ip.last.gsub /IP:\s+/, ''
  else
    request.remote_ip
  end
end

# controller
ip = request_ip
response = Geocoder.search(ip)
(The hostip.info part comes from the geo_magic gem and is based on the other answer to this question.)
Now you can do something like response.first.state.
This is an updated answer for geocoder 1.2.9 to provide a hardcoded IP for development and test environments. Just place this at the bottom of your config/initializers/geocoder.rb:
if %w(development test).include? Rails.env
  module Geocoder
    module Request
      def geocoder_spoofable_ip_with_localhost_override
        ip_candidate = geocoder_spoofable_ip_without_localhost_override
        if ip_candidate == '127.0.0.1'
          '1.2.3.4'
        else
          ip_candidate
        end
      end
      alias_method_chain :geocoder_spoofable_ip, :localhost_override
    end
  end
end
You may also do this:
request.safe_location
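For example (a sketch; as I understand it, safe_location resolves the non-spoofable REMOTE_ADDR rather than client-supplied forwarding headers):
# e.g. in a controller action
def index
  location = request.safe_location
  @city = location.city if location
end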

Redirect 'myapp.com' to 'www.myapp.com' in rails without using htaccess?

Using Morph Labs' Appspace to deploy a site means there's no automated way to redirect 'myapp.com' to 'www.myapp.com' (and no access to .htaccess).
Is there an in-rails way to do this? Would I need a plugin like subdomain-fu?
More specifically, I'm trying to do something like:
'myapp.com' => 'www.myapp.com'
'myapp.com/session/new' => 'www.myapp.com/session/new'
Basically, I always want the 'www' subdomain prepended on every request (because the SSL cert specifically has a common name of 'www.myapp.com').
Maybe something like this would do the trick:
class ApplicationController < ActionController::Base
  before_filter :check_uri

  def check_uri
    redirect_to request.protocol + "www." + request.host_with_port + request.request_uri if !/^www/.match(request.host)
  end
end
Carson's answer works great.
Here's the code to go the other way (www -> no www):
before_filter :check_uri

def check_uri
  if /^www/.match(request.host)
    redirect_to request.protocol + request.host_with_port[4..-1] + request.request_uri
  end
end
I had to change Carson's answer to get this to work in Rails 3. I replaced request.request_uri with request.fullpath:
class ApplicationController < ActionController::Base
  protect_from_forgery
  before_filter :check_url if Rails.env.production?

  def check_url
    redirect_to request.protocol + "www." + request.host_with_port + request.fullpath if !/^www/.match(request.host)
  end
end
This worked great for me. I did make one small addition as I only wanted this behavior in my production environment:
def check_uri
  redirect_to request.protocol + "www." + request.host_with_port + request.request_uri if !/^www/.match(request.host) && Rails.env == 'production'
end
I know this is answered, but I thought everyone else should know about the CodeRack: Canonical Host solution. This is really nice as it allows for env-specific redirects. http://coderack.org/users/tylerhunt/middlewares/6-canonical-host
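If you'd rather not hand-roll the middleware, the rack-canonical-host gem covers the same ground; usage is roughly as follows (an assumption from memory, so check the gem's README before relying on it):
# config.ru
use Rack::CanonicalHost, 'www.myapp.com'
run MyApp::Application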
Here are a couple of different ways:
head :moved_permanently, :location => 'http://www.newdomain.com'
another:
def rails_301
  headers["Status"] = "301 Moved Permanently"
  redirect_to "http://www.newdomain.com"
end
For those of you who are looking to also force SSL using heroku, this worked well for me, based on Heroku SSL on root domain
In my DNS settings I set up a URL / Forward record (DNSimple):
URL foo.com 3600 http://www.foo.com
The CNAME only needs to be set up for www:
CNAME www.foo.com 3600 providedsslendpoint.herokussl.com
I also had to set up an ALIAS for my root:
ALIAS foo.com 3600 providedsslendpoint.herokussl.com
Then I decided to simply replace foo.com with an env variable, ENV['SITE_HOST'] (where SITE_HOST might equal www.foo.com or test.foo.com), so I can have control via my Heroku configuration. That way, I can control what happens in different environments. (For setting up env variables locally, see https://github.com/bkeepers/dotenv.)
For example, my test app uses test.foo.com as the url it also has its own SSL endpoint so that works fine for me.
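For example, a local .env file for dotenv might contain (an illustrative value):
SITE_HOST=www.foo.com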
before_filter :check_domain

def check_domain
  if (Rails.env.production? || Rails.env.testing?) && request.host.downcase != ENV['SITE_HOST']
    redirect_to request.protocol + ENV['SITE_HOST'] + request.fullpath, :status => 301
  end
end
From now on, end users will always access www with forced SSL. Old links will suffer a small hang but nothing noticeable.
