Iframing an https site on http sites? - ruby-on-rails

I'm about to add an ssl certificate to a rails application hosted on heroku, so I can activate stripe payments.
The main function of our app is to let users create embeddable widgets. The widgets are essentially iframes of the views for the objects they're creating in our rails app.
The vast majority of our users' sites are using http, and I'm concerned that if we switch our app domain to https, the iframed widgets they've embedded would stop working.
Is it possible to have a secured domain name for our app, and let the users embed widgets that iframe parts of the app with source urls using just http?

You could enforce SSL on only certain routes:
scope constraints: { protocol: "https" } do
  # routes you'd like secured
end
Then don't enable force_ssl for the entire site, and the other routes will remain unsecured.
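For example (a sketch only; the resource names are invented for illustration), the payment routes could require https while the embeddable widget routes stay reachable over plain http:

```ruby
# config/routes.rb -- hypothetical sketch, not your actual routes.
Rails.application.routes.draw do
  # Only the payment flow insists on https.
  scope constraints: { protocol: "https" } do
    resources :payments
  end

  # Widget views stay available over http so existing embeds keep working.
  resources :widgets, only: [:show]
end
```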

An iframe that points to an http:// URL should just get redirected to https://, so I foresee no problems there.
If you have a form with an action attribute using http://, that could be a problem, as a redirect won't carry the POST data with it.
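Because the redirect drops the POST body, the safest fix is to point form actions at https up front. A minimal helper sketch (the function name is my own):

```ruby
require "uri"

# Upgrade a form's action URL to https before rendering it, so the
# POST goes straight to the secure endpoint and never hits a redirect
# (which would drop the request body).
def force_https(url)
  uri = URI.parse(url)
  uri.scheme = "https"
  uri.to_s
end
```

For example, `force_https("http://example.com/pay")` yields `"https://example.com/pay"`.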

Related

Create iFrame of my Rails app for select websites

I have a Rails app that I would like to have customers embed in their website (via an iFrame or similar code) where they can submit a form and potentially view account info.
I currently don't use OAuth, but I was wondering how to log them in safely given the strict same-origin and CORS settings most sites use to prevent clickjacking and such.
My initial thought was giving the iFrame a webpage with a designated token in the URL to show it comes from a valid site, but that token could easily be copy-pasted by attackers. I'm pretty sure OAuth tries to prevent that, but as mentioned I don't have it currently implemented.
By default, Rails restricts framing to the same origin via the X-Frame-Options header. If you want to let external sites embed yours, you can disable that restriction:
config.action_dispatch.default_headers = { 'X-Frame-Options' => 'ALLOWALL' }
This will allow any other site to embed your site. If you instead want to allow only specific external sites, you can set the header in your base controller:
response.headers["X-Frame-Options"] = "ALLOW-FROM http://dummysite.com"
Keep this code in a method and call it with a before_action.
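A sketch of the header-picking logic behind such a before_action (the method name and allowlist are hypothetical):

```ruby
# Sites permitted to frame us; anything else falls back to SAMEORIGIN.
ALLOWED_FRAMERS = ["http://dummysite.com"].freeze

# Returns the X-Frame-Options value for a given embedding origin.
# In a Rails controller you would call this from a before_action and
# assign the result to response.headers["X-Frame-Options"].
def frame_options_for(origin)
  ALLOWED_FRAMERS.include?(origin) ? "ALLOW-FROM #{origin}" : "SAMEORIGIN"
end
```

Note that ALLOW-FROM was never supported by all browsers; the Content-Security-Policy frame-ancestors directive is the modern replacement for per-site framing allowlists.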

Authenticate user before displaying an iFrame

I am preparing to work on a project where I need to display a dashboard from an online application. Unfortunately, the use of an API is currently not possible. The dashboard can be embedded in an iFrame. However, when it is displayed it will prompt the user viewing the dashboard to login to an account.
I have one paid account to this service. Are there any rails gems to login to the service before the iFrame is processed?
Or would a proxy within my rails app be a better route to go?
Any pointers are appreciated!
Neither a Rails gem nor a proxy within your Rails app will work; both have the same limitation.
They both run on the back end, server side.
The authentication you need is client side.
Unless you mean proxying the ENTIRE thing: the auth request and all subsequent requests and user interactions with this dashboard. That should work, but see below.
The way authentication works (pretty much universally) is: once you log in to any system, it stores a cookie on your browser and then the browser sends that cookie for every subsequent request.
If you authenticate on the back end, that cookie will be sent to your Rails code and will die there, and the user's browser will never know about it.
Also - it is not possible to do the auth server side and capture the cookie and then have the user browse the site with their browser directly, for two reasons:
Sometimes auth cookies use information about the browser or HTTP client to encrypt the cookie, so sending the same cookie from a different client won't work
You cannot tell a browser to send a cookie to a domain different from your own.
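That domain rule can be sketched roughly as follows (a simplification; real browsers also consult the Public Suffix List and the cookie's Path and Secure attributes):

```ruby
# Rough sketch of the browser rule: a cookie set for one domain is only
# sent back to that domain or its subdomains, never to an unrelated host.
def cookie_sent?(cookie_domain, request_host)
  request_host == cookie_domain || request_host.end_with?(".#{cookie_domain}")
end
```

So a cookie for example.com travels to www.example.com, but a cookie your Rails app captures server-side can never be made to travel to the dashboard's domain.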
So your options are, off the top of my head right now:
If there is a login page that accepts form submissions from other domains, you could try to simulate a form submission directly to that site's "after login" page (the page the user gets directed to once they fill in the login form). However, any modern web framework has CSRF protection (Cross-Site Request Forgery protection) and will disallow this approach for security reasons.
See if the auth this site uses offers any kind of OAuth, Single Sign-On (SSO), or similar authentication integration that you can do. (Similar to an API, so you may have already explored this option.)
Proxy all requests to this site through your server. You will have to rewrite the entire HTML so that all images, CSS, JavaScript, and other assets are also routed through the proxy, or else rewrite their URLs in the HTML to be absolute. You might hit various walls if the site wasn't designed for this use case: relative asset URLs you aren't proxying, non-relative URLs causing cross-domain errors, and so on. Note it's really hard to rewrite every last asset reference; it's not only the HTML you have to worry about, since JavaScript can contain URLs too, and so can CSS.
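To give a taste of why that rewriting is painful, here is a deliberately naive sketch (the proxy endpoint URL is hypothetical) that rewrites only src/href attributes, and still misses URLs buried in JavaScript and CSS:

```ruby
require "cgi"

# Hypothetical proxy endpoint that fetches and re-serves a target URL.
PROXY_PREFIX = "https://ourapp.example.com/proxy?url="

# Naive sketch: rewrite src/href attributes so they route back through
# the proxy. A real implementation would need an HTML parser (e.g.
# Nokogiri), and even then URLs inside scripts and stylesheets escape it.
def rewrite_urls(html, origin)
  html.gsub(%r{(src|href)="(https?://[^"]+|/[^"]*)"}) do
    attr, url = Regexp.last_match(1), Regexp.last_match(2)
    # Make root-relative URLs absolute against the proxied site's origin.
    absolute = url.start_with?("/") ? origin + url : url
    %(#{attr}="#{PROXY_PREFIX}#{CGI.escape(absolute)}")
  end
end
```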
You could write a bookmarklet or a browser extension that logs the user into the site.
Have everyone install Lastpass
Have everyone install the TamperMonkey browser extension (or one of its equivalents for other browsers), and write a small user script that runs custom JavaScript to automatically log the user in on that site
Scrape that site for the info you need and serve it on your own site.
OK I'm out of ideas. :)

Subdomain security in Rails

Let's say I'm trying to create an application called Blue. Blue is a Ruby on Rails application that turns the background of any website blue. It also allows users to log in and keep track of the websites they've visited and turned blue.
In order to turn a website's background blue, I've created a web proxy that inserts <link href="http://www.example.com/blue.css" rel="stylesheet" type="text/css"> into the response's body. The proxy is implemented as a Rack application and is placed inside the Rails routes using the approach from the Rack in Rails 3 Railscast:
root :to => BlueProxy, :constraints => { :subdomain => "proxy" }
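The body-rewriting step of such a proxy could look roughly like this (a simplified sketch of one thing BlueProxy would do per response; real responses may be compressed or chunked, which this ignores):

```ruby
BLUE_CSS_TAG = %(<link href="http://www.example.com/blue.css" rel="stylesheet" type="text/css">)

# Insert the stylesheet link just before </head> in the proxied body.
def insert_blue_css(html)
  # "\\0" in the replacement re-emits the matched </head> after the tag.
  html.sub(%r{</head>}i, "#{BLUE_CSS_TAG}\\0")
end
```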
I'm very concerned about security with this approach. I know by default the domain for the cookies in my application would be .example.com. If the user typed in a malicious URL, the website could manipulate the user's account. I could fix this by only allowing the www subdomain for cookies in the application. However, I'd also like the proxy to be able to store cookies for the proxied site as well.
Here are my three questions:
Is this a bad approach? Is there a better way to solve this problem?
What's the best way to keep sibling subdomain cookies separate in Rails?
Are there any other security concerns I'm missing?
This approach is dangerous, and I caution you against running a proxy for several reasons:
It brings up a host of legal issues, ranging from people accessing illegal content to you hosting content for your own benefit (and modifying it).
Your bandwidth (and hosting fees) will explode if the site gets popular.
Loading content inside an iframe has UX issues, like the browser back button not quite behaving as the user expects.
Running a proxy opens up several more attack vectors to your site (e.g. sending a permalink to a malicious site proxied through your site) that you'll have to consider from a security perspective.
Instead of running an open proxy (okay, maybe it's not completely open, but how hard is it for someone to sign up?) on your back end, consider using a browser extension or Greasemonkey script on the front end that can get its set of rules from your Rails app and then apply the stylesheet changes on the client side.

Http redirect to Https and SEO with asp.net mvc

I am wondering if I should keep the GET pages off SSL while putting the POST for that same URL on SSL. I have been reading about SEO issues with redirects on pages that require SSL. ASP.NET MVC has an attribute for this; however, it does a 302 redirect instead of a 301. Is there a best practice for doing this with SEO in mind? I have pages such as Login, Register, and an Account page.
Typically, in terms of SEO, these principles don't apply to forms, as the spider doesn't try to complete them, so the redirects from those pages don't affect it. It just follows static links and the like.
The main thing to worry about with SEO on a password protected site is ensuring that paid/private content doesn't end up in the search engine etc etc.
/* edit */
In terms of the redirect, I wouldn't use a flag on the controller to action this. If a site must be viewed via the https protocol, then I would set the Require SSL property on the directory in IIS and handle the 403.4 error specifically to issue a permanent redirect to the place the person was trying to access.
This will ensure any static pages that are accessed are pulled through the https protocol too.
Si

SSL-secured website best practices

I have a website (www.mydomain.com) that is secured with an SSL certificate. It is an ASP.NET website and I have forced certain pages via code to be required to use the https:// prefix. If they don't it will redirect them to the https:// equivalent. Is this a good practice? Is there an easier way to do this? Not every single page requires SSL.
Also, when users type my URL in the form mydomain.com instead of www.mydomain.com, they get a certificate error because the certificate was registered for www.mydomain.com. Should I use the same approach as with the http:// and https:// issue I mentioned above? Or is there a better way of handling this?
Your approach sounds fine. In my current project, I force HTTPS when a user goes to my login page (based on a config flag, which lets me test locally without needing a cert). This allows me to access other pages unsecured, which is handy.
I have a couple of places where our server grabs the output of other pages (rendering HTML to PDF and fetching dynamic images, for example). Because of our environment, our server can't resolve its own public name, so if we were to force SSL site-wide we'd have to add our internal IP address (or fake the domain name).
As for your second question you have two options to handle the www.example.com vs example.com. You can buy a certificate that allows you to have multiple domain names. These are known as UCC certificates.
Your second option is to redirect example.com to www.example.com, or the other way around. Redirecting is a great option if you want your content to be indexed by Google or other search engines, since they will see www.example.com and example.com as two separate sites. Without the redirect, links to your site will be split between the two, reducing your overall page rank.
You can configure sites in IIS to require a Cert but that would A) generate an error if someone isn't visiting with https and B) require all pages to use https. So, that won't work. You could put a filter on IIS that checks all requests and redirects them as https calls if they are on your encryption list. The obvious drawback here is the need to update your list of pages every time a new page is added (e.g. from an XML file or database) and restart the filter.
I think that you are probably correct in building code into the pages that require https that redirects to an https version if they arrive via http. As far as your cert error goes, you could redirect with a full path (that includes the www) instead of a relative path to fix this problem. If you have any questions about how to detect whether the call uses https OR how to get the full path of the current request please let me know. Both are pretty straightforward but I've got sample code if you need it.
UPDATE - Josh, the certs that handle multiple subdomains are called wildcard certs. The problem is that they are quite a bit more expensive than standard certs.
UPDATE 2: One other thing to consider is to use a Master page or derived class for the pages that need SSL. That way, instead of duplicating the code in each page you can just declare it as type SSLPage (or use the corresponding Master page) and have the Master/Parent class handle the redirect. Again, you'll need to do some URL processing if you take this approach but it is pretty trivial.
Following is something that can help you:
If it is fine to display all your website pages with https:// then you can simply update your code to use https:// and set two bindings in IIS. One is for http and another is for https. In this way, your website can be accessible through any of the protocol.
Your visitors are receiving a name mismatch error because the common name used in your SSL certificate is www.mydomain.com. Namecheap provides RapidSSL certificates through which you can secure both names under a single SSL certificate. You can purchase such a certificate for www.mydomain.com and it will automatically secure mydomain.com (i.e. without www).
Another option is to write code that redirects your visitors to www.mydomain.com even if they browse to mydomain.com.
