Let's say I'm trying to create an application called Blue. Blue is a Ruby on Rails application that turns the background of any website blue. It also allows users to log in and keep track of the websites they've visited and turned blue.
In order to turn a website's background blue, I've created a web proxy that inserts <link href="http://www.example.com/blue.css" rel="stylesheet" type="text/css"> into the response's body. The proxy is implemented as a Rack application and is placed inside the Rails routes using the approach from the Rack in Rails 3 Railscast:
root :to => BlueProxy, :constraints => { :subdomain => "proxy" }
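To make the idea concrete, here is a minimal sketch of the injection step (not my actual BlueProxy; fetching the upstream page is elided behind a hypothetical fetch_upstream helper):

class BlueProxy
  BLUE_CSS = %(<link href="http://www.example.com/blue.css" rel="stylesheet" type="text/css">)

  def call(env)
    # fetch_upstream is a hypothetical helper that performs the proxied request
    status, headers, body = fetch_upstream(env)
    html = ''
    body.each { |chunk| html << chunk }
    # Inject the stylesheet just before the closing head tag
    html.sub!('</head>', "#{BLUE_CSS}</head>")
    headers['Content-Length'] = html.bytesize.to_s
    [status, headers, [html]]
  end
end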
I'm very concerned about security with this approach. I know by default the domain for the cookies in my application would be .example.com. If the user typed in a malicious URL, scripts from the proxied site would run on proxy.example.com, receive my application's cookies, and could manipulate the user's account. I could fix this by restricting the application's cookies to the www subdomain. However, I'd also like the proxy to be able to store cookies for the proxied site as well.
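(For reference, I believe restricting the session cookie to www would look roughly like this; the key name here is made up:)

# config/initializers/session_store.rb
Blue::Application.config.session_store :cookie_store,
  :key => '_blue_session', :domain => 'www.example.com'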
Here are my three questions:
Is this a bad approach? Is there a better way to solve this problem?
What's the best way to keep sibling subdomain cookies separate in Rails?
Are there any other security concerns I'm missing?
This approach is dangerous, and I caution you against running a proxy for several reasons:
It brings up a host of legal issues ranging from people accessing illegal content to your hosting content for your own benefit (and modifying it).
Your bandwidth (and hosting fees) will explode if the site gets popular.
Loading content inside an iframe has UX issues, like the browser back button not quite behaving the way the user expects.
Running a proxy opens up several more attack vectors to your site (e.g. sending a permalink to a malicious site proxied through your site) that you'll have to consider from a security perspective.
Instead of running an open proxy (okay, maybe it's not completely open, but how hard is it for someone to sign up?) on your back end, consider using a browser extension or greasemonkey script on the front end that can get its set of rules from your rails app and then add the stylesheet changes on the client side.
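The Rails side of that can stay tiny. As a sketch, something like the following, where the controller and the blued_domains association are hypothetical:

class RulesController < ApplicationController
  def index
    # The extension polls this endpoint and applies the stylesheet client side
    render :json => { :stylesheet => 'http://www.example.com/blue.css',
                      :domains => current_user.blued_domains }
  end
end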
I'm using the rack-cors gem with Rails:
https://github.com/cyu/rack-cors
I need to whitelist ONE domain so that requests from it are allowed through.
I would think that this would allow traffic from the whitelisted domain. I am making a POST request from https://reflective-basket.surge.sh/ to my Rails app. (The domain name has been modified for the sake of this post on Stack Overflow.)
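My initializer looks roughly like this (the resource pattern is simplified for this post):

# config/initializers/cors.rb
Rails.application.config.middleware.insert_before 0, Rack::Cors do
  allow do
    origins 'https://reflective-basket.surge.sh'
    resource '*', headers: :any, methods: [:get, :post, :options]
  end
end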
However, POST requests will not go through. The destination Rails app says:
The page you were looking for doesn't exist.
You may have mistyped the address or the page may have moved.
If I remove the protect_from_forgery line (protect_from_forgery with: :exception) from the application controller, the app of course allows all traffic through, but this defeats the purpose of having a secure app.
I'm sure this is a common problem (needing a form on website A to submit data to website B, but only from a certain domain), but this just doesn't seem to work as I would have hoped. Any pointers? I'm open to making this work in any way that's possible.
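One workaround I'm considering is skipping the CSRF check for just the one endpoint, so the rest of the app stays protected (a sketch; the controller name is made up):

class SubmissionsController < ApplicationController
  # CORS and CSRF are separate checks: rack-cors can approve the origin,
  # but protect_from_forgery will still reject a POST without a valid token.
  skip_before_action :verify_authenticity_token, only: [:create]

  def create
    # handle the cross-origin POST here
    head :ok
  end
end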
I am preparing to work on a project where I need to display a dashboard from an online application. Unfortunately, the use of an API is currently not possible. The dashboard can be embedded in an iframe. However, when it is displayed, it prompts the user viewing the dashboard to log in to an account.
I have one paid account for this service. Are there any Rails gems to log in to the service before the iframe is rendered?
Or would a proxy within my Rails app be a better route to go?
Any pointers are appreciated!
Neither a Rails gem nor a proxy within your Rails app will work; they both have the same limitation.
They are both running on the back-end, server side.
The authentication you need is client side.
Unless you mean proxying the ENTIRE thing: the auth request and all subsequent requests and user interactions with the dashboard. That could work, but see below.
The way authentication works (pretty much universally) is: once you log in to any system, it stores a cookie on your browser and then the browser sends that cookie for every subsequent request.
If you authenticate on the back end, that cookie will be sent to your Rails code and will die there; the user's browser will never know about it.
Also - it is not possible to do the auth server side and capture the cookie and then have the user browse the site with their browser directly, for two reasons:
Sometimes auth cookies use information about the browser or HTTP client to encrypt the cookie, so sending the same cookie from a different client won't work.
You cannot tell a browser to send a cookie to a domain other than your own.
So your options are, off the top of my head right now:
If there is a login page that accepts form submissions from other domains, you could try to simulate a form submission directly, landing the user on that site's "after login" page (the page the user is directed to once they fill out the login form). However, any modern web framework has XSRF protection (Cross-Site Request Forgery protection) and will disallow this approach for security reasons.
See if the auth this site uses has any kind of OAuth, Single Sign-On (SSO), or similar authentication integration that you can do. (This is similar to an API, so you may have already explored this option.)
Proxy all requests to this site through your server. You will have to rewrite the entire HTML so that all images, CSS, stylesheets, and other assets are also routed through the proxy, or else rewrite their URLs in the HTML to be absolute. You might hit various walls if the site wasn't designed for this use case, from the site using relative URLs for assets that you aren't proxying, to the site referencing non-relative URLs and causing cross-domain errors, etc. Note that it's really hard to rewrite every single last asset reference; it's not only the HTML you have to worry about, since JavaScript can have URLs in it too, and so can CSS. (A minimal rewriting sketch follows at the end of this answer.)
You could write a bookmarklet or a browser extension that logs the user into the site.
Have everyone install Lastpass
Have everyone install the TamperMonkey browser extension (or one of its equivalents for other browsers), and write a small user script that runs custom JavaScript automatically to log the user in on that site.
Scrape that site for the info you need and serve it on your own site.
OK I'm out of ideas. :)
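Actually, one more thing: here is a minimal sketch of the URL-rewriting step from the proxy option above. It assumes a /proxy?url=... route (both the route and the function are hypothetical) and deliberately ignores URLs buried in JavaScript and CSS:

require 'cgi'
require 'nokogiri'
require 'uri'

def rewrite_asset_urls(html, upstream_base)
  doc = Nokogiri::HTML(html)
  # Only the most common attributes; references inside JS and CSS won't be caught
  { 'img' => 'src', 'script' => 'src', 'link' => 'href', 'a' => 'href' }.each do |tag, attr|
    doc.css("#{tag}[#{attr}]").each do |node|
      absolute = URI.join(upstream_base, node[attr]).to_s
      node[attr] = "/proxy?url=#{CGI.escape(absolute)}"
    end
  end
  doc.to_html
end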
I am helping to create a Rails app that uses Ember for the front-end MVC. The app hosts user content accessed via subdomains, and on those subdomains users can upload custom JS and CSS. What I'm wondering is whether an authentication token for the root domain, stored in Ember, is safe from the custom JS people can upload and run on their subdomains.
Provided the following:
Don't scope cookies to *.domain.com (or, better, don't use cookies at all).
They can't run the JS/CSS on your main site (or really display it unescaped in any way).
The Ember app with your token doesn't run on their subdomain (obviously).
They can't put HTML in a file with a different extension or even Content-Type on your subdomain (unless you aren't using cookies at all). Otherwise they could direct a user's web browser there and it would display as HTML. Be wary of phishing, too, since it looks like it's your secure content; I can't imagine you could prevent this easily other than by not using cookies, short of 100% ensuring properly formatted JS/CSS, which would present all kinds of problems.
You can limit cookies to domain.com and www.domain.com, but I don't recommend it (it's prone to mistakes). If you don't, somebody can make a GET request through CSS or, e.g., an image tag (not to mention JavaScript), and the browser will send the authenticated cookies to your server. Remember that unescaped input in their app can leave holes too.
If your token is stored in Ember and they have access to custom JS where the app is running, of course that leaves your token vulnerable. If you run your Ember app only on www.domain.com, avoid cookies, and store the token only locally/in JS, you might be okay.
Again: if they just put HTML in a file with another extension and direct people there, it will be interpreted as HTML.
My MVC site uses the AntiForgeryToken code, which works well in Chrome and Firefox. However, in IE10, I have noticed that it gives me the error:
required anti-forgery cookie "__RequestVerificationToken" is not present
It's definitely a cookie-related issue, as the site works fine when I allow all cookies (i.e. the lowest privacy settings).
However, I have also noticed that when I go to GoDaddy and take off domain forwarding masking (but leave the domain forwarding in), it also works fine.
Is there a way to get this working with the masking? (Masking is an option that forwards a domain while hiding the underlying host name. I am using it because I am on Azure Websites and would rather have my users see my actual domain name, not xxx.azurewebsites.net.)
Thanks for any help here!
Domain forwarding masking works by hosting your real URL inside a frame. In that scenario, your real website content is coming from a different domain than the main page's domain. As such, any cookies your site tries to set will be interpreted as 'third party cookies' and are going to be blocked by any browser set to block those kinds of cookies (including, apparently, IE10 with its default settings).
Frankly, I think you are fighting a losing battle here. These kinds of cookies are benign in your use case, but they look exactly like the kinds of cookies advertisers are using to track people across websites, and so I would expect browsers to be even more hostile to them as time goes by.
I think your options in this case are to not need cookies (e.g. don't use the anti-CSRF features provided by ASP.NET MVC), or to move your website to a host that allows you to serve your content directly at the real URL (so that you don't have to use the GoDaddy masking technique). The latter is probably the best long-term solution.
We need to default the URL to one canonical name: if the user arrives with www, redirect to no prefix, or vice versa. So the decision to be made is whether to stick with www or with no prefix.
With no prefix, the cookie is set for all subdomains. What are the other downsides? Or benefits?
Basically we need this for OpenID, as OpenID will make users look different depending on whether they came from www or from no prefix.
As our site is new so we can go with either one. Also, how the domain name looks is not much of a concern.
You probably want to redirect (with an HTTP 301, Permanent Redirect) one to the other anyway, since maintaining consistent URLs is much easier that way. So whichever you decide, just make sure the actual authentication is done after the redirect, and users looking different won't be an issue.
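In a Rails app (assuming that's what you're on, as elsewhere in this thread), the redirect might look like this sketch, with example.com as a placeholder:

# config/routes.rb
constraints(:host => 'www.example.com') do
  match '/(*path)' => redirect { |params, req| "http://example.com#{req.fullpath}" }
end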
That said, whether you want www or not depends entirely on how other things in your application work. You mention that cookies for domain.com will be saved for all subdomains: is this something you want? Are you ever going to need to differentiate (for example, by allowing users to set up their own authentication systems for subdomains, as a shared hosting service might do)?
If none of the differences you find between including and excluding www matter to your application, I'd go for not using www. The main reason for this is my picture of current trends on the internet - more and more applications (SO is an example of this) tend to leave the www out, both when linking to their own sites, and in marketing of different kinds.
However, the main point is to make both work. You don't want your site to break because the user did(n't) type www at the beginning of the URL.
By not using the www subdomain, you can suffer a performance hit when delivering static content, as noted here: http://developer.yahoo.com/performance/rules.html#cookie_free. As I understand it, if you use http://example.com/ and http://static.example.com for static content, any cookies you set on the main domain will be passed with requests to your static subdomain.
This can be avoided quite easily by buying a distinct domain for static content. However, it can certainly also be dealt with by using a www subdomain.
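If you're on Rails, for instance, pointing assets at a separate cookie-free domain is a one-liner (a sketch; the host is a placeholder):

# config/environments/production.rb
config.action_controller.asset_host = 'http://static.example-cdn.com'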
Then again, this is a very minor con, and really only comes into play when you're dealing with a high-demand site. (For example, Digg uses http://digg.com and http://*.diggstatic.com).
Ultimately, I would say that this is such a minor problem that it can probably be dealt with if performance starts to suffer. Don't optimize prematurely, and all that...
And, as @Tomas Lycken points out, make sure you account for www even if you don't use the subdomain.