I'm not getting any errors or problems storing cookies on Chrome or Firefox, but with the default cookie settings on Safari I have quite a bit of trouble getting cookies to store properly.
I'm working cross-site (rendering an iframe from another domain), so I'm not sure if there is something special I should be doing for this on Safari.
http://prntscr.com/e1yt22
This setting will always make my website work properly and store cookies.
http://prntscr.com/e1ytfc
This setting, however, will not.
EDIT 1: Someone suggested the following:
The solution is to access the API/service from a sub-domain, e.g.
“api.somedomain.com”. This should cause Safari to hold onto the cookie
so it can be re-used for the CORS requests.
Will the subdomain workaround work if the two domains are different?
Will I run into trouble with cookies if I do this on Safari, which doesn't allow third-party cookies by default?
Example: <iframe src="xyz.example.com"></iframe> inside the xyz.com website?
Or does it have to be like this:
<iframe src="xyz.example.com"></iframe> inside the example.com website?
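For context: the subdomain workaround only helps when the iframe and the embedding page share the same registrable domain. A cookie can be scoped at most to its own registrable domain (e.g. .example.com), and browsers decide "third party or not" by comparing the registrable domains of the top page and the frame. So xyz.example.com framed inside example.com is first-party, while xyz.example.com framed inside xyz.com is still third-party and Safari's default setting will block it. A minimal Rack sketch (hypothetical names; save as config.ru and run with rackup) of setting such a shared cookie:

    # config.ru - minimal sketch of a cookie shared across subdomains.
    require 'rack'

    app = lambda do |env|
      response = Rack::Response.new('cookie set')
      # Scoping the cookie to .example.com makes it visible to
      # example.com and every subdomain, e.g. xyz.example.com.
      response.set_cookie('session_id',
                          value: 'abc123',
                          domain: '.example.com',
                          path: '/',
                          httponly: true)
      response.finish
    end

    run app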
Related
I am preparing to work on a project where I need to display a dashboard from an online application. Unfortunately, using an API is currently not possible. The dashboard can be embedded in an iframe; however, when it is displayed, it will prompt the user viewing the dashboard to log in to an account.
I have one paid account for this service. Are there any Rails gems to log in to the service before the iframe is rendered?
Or would a proxy within my rails app be a better route to go?
Any pointers are appreciated!
Neither a Rails gem nor a proxy within your Rails app will work; they both have the same limitation.
They are both running on the back-end, server side.
The authentication you need is client side.
Unless you mean proxying the ENTIRE thing: the auth request and all subsequent requests and user interactions with this dashboard. That should work, but see below.
The way authentication works (pretty much universally) is: once you log in to any system, it stores a cookie on your browser and then the browser sends that cookie for every subsequent request.
If you authenticate on the back end, that cookie will be sent to your Rails code and die there; the user's browser will never know about it.
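To make that concrete, here's a rough Ruby sketch (the URL, credentials, and paths are all invented) of authenticating server-side. The cookie ends up in a variable on your server and can only be reused for further server-side requests:

    require 'net/http'
    require 'uri'

    # Hypothetical login endpoint and credentials.
    login_uri = URI('https://dashboard.example.com/login')
    login = Net::HTTP.post_form(login_uri, 'user' => 'me', 'password' => 'secret')

    # The session cookie lands here, on YOUR server...
    session_cookie = login['Set-Cookie']

    # ...so only server-side requests like this one can reuse it.
    # The user's browser never receives it; their iframe stays logged out.
    http = Net::HTTP.new('dashboard.example.com', 443)
    http.use_ssl = true
    request = Net::HTTP::Get.new('/dashboard')
    request['Cookie'] = session_cookie
    puts http.request(request).code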
Also: it is not possible to do the auth server-side, capture the cookie, and then have the user browse the site directly with their browser, for two reasons:
Sometimes auth cookies use information about the browser or HTTP client to encrypt the cookie, so sending the same cookie from a different client won't work.
You cannot tell a browser to send a cookie to a domain other than your own.
So your options are, off the top of my head right now:
If there is a login page that accepts form submissions from other domains, you could try to simulate a form submission directly to that site's "after login" page (the page the user is directed to once they fill in the login form). However, any modern web framework has XSRF protection (Cross-Site Request Forgery protection) and will disallow this approach for security reasons.
See if the auth this site uses offers any kind of OAuth, Single Sign-On (SSO), or similar authentication integration that you can use (similar to an API, so you may have already explored this option).
Proxy all requests to this site through your server (see the sketch after this list). You will have to rewrite the entire HTML so that all images, CSS, stylesheets, and other assets are also routed through the proxy, or else rewrite their URLs in the HTML so they are no longer relative. You might hit various walls if the site wasn't designed for this use case, from the site using relative URLs for assets that you aren't proxying, to the site referencing absolute URLs that cause cross-domain errors. Note that it's really hard to rewrite every last asset reference; it's not only the HTML you have to worry about, since JavaScript and CSS can contain URLs too.
You could write a bookmarklet or a browser extension that logs the user into the site.
Have everyone install LastPass.
Have everyone install the TamperMonkey browser extension (or a similar one for other browsers), and write a small user script to run custom JavaScript that automatically logs the user in on that site.
Scrape that site for the info you need and serve it on your own site.
OK I'm out of ideas. :)
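To give a feel for what option 3 involves, here's a very rough Rack sketch using Nokogiri (the host name and details are invented; a real proxy would also have to forward cookies and POSTs, follow redirects, and catch URLs buried in JS and CSS):

    require 'rack'
    require 'net/http'
    require 'nokogiri'
    require 'uri'

    # Hypothetical upstream site being proxied.
    UPSTREAM = URI('https://dashboard.example.com')

    class DashboardProxy
      def self.call(env)
        request  = Rack::Request.new(env)
        upstream = Net::HTTP.get_response(URI.join(UPSTREAM.to_s, request.fullpath))

        body = upstream.body.to_s
        if upstream['Content-Type'].to_s.include?('text/html')
          doc = Nokogiri::HTML(body)
          # Rewrite asset and link URLs so they route back through this proxy.
          doc.css('a[href], link[href], img[src], script[src]').each do |node|
            attr  = node['href'] ? 'href' : 'src'
            value = node[attr].to_s
            next unless value.start_with?('/', UPSTREAM.to_s)
            node[attr] = URI.join(UPSTREAM.to_s, value).request_uri
          end
          body = doc.to_html
        end

        [upstream.code.to_i, { 'Content-Type' => upstream['Content-Type'].to_s }, [body]]
      end
    end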
I am helping to create a Rails app that uses Ember as a front-end MVC. The app hosts user content accessed via subdomains, and on those subdomains users can upload custom JS and CSS. What I'm wondering is whether token authentication on the root domain, with the token stored in Ember, will be safe from the custom JS people can upload and run on their subdomains.
Provided the following:
You don't use cookies scoped to *.domain.com (or, better, don't use cookies at all).
They can't run (or really, display unescaped in any way) the JS/CSS on your main site.
The ember app with your token doesn't run on their sub-domain (obviously).
They can't put HTML in a file with a different extension or even a different Content-Type on your subdomain (or you aren't using cookies). Otherwise they could direct a user's web browser there and it would display as HTML. Be wary of phishing too (it looks like it's your secure content). I can't imagine you could prevent this easily other than by not using cookies, short of 100% ensuring properly formatted JS/CSS, which would present all kinds of problems.
You can limit cookies to domain.com and www.domain.com, but I don't recommend it (it's prone to mistakes). If you don't, somebody can make a GET request through CSS or, say, an image tag (not to mention JavaScript), and it'll send the authenticated cookies to your server. Remember that unescaped input in their apps can leave holes too.
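If you do go the cookie route, scoping the session cookie explicitly is a one-liner in Rails (a sketch; the key and domain are placeholders):

    # config/initializers/session_store.rb
    # Scope the session cookie to www only, so user subdomains
    # never receive it.
    Rails.application.config.session_store :cookie_store,
      key: '_myapp_session',
      domain: 'www.example.com'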
If your token is stored in Ember and they have access to custom JS where the app is running, of course it'll leave your token vulnerable. If you run your Ember app only on www.domain.com, avoid cookies, and store the token only locally/in JS, you might be okay.
But remember: if they just put HTML code in a file with another extension and direct people there, it'll be interpreted as HTML.
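One standard mitigation for that last risk, sketched here in Rails (the model and controller names are invented), is to serve user files with content sniffing disabled and a download disposition:

    # Hypothetical controller serving user-uploaded files.
    class UploadsController < ApplicationController
      def show
        upload = Upload.find(params[:id]) # invented model
        # Stop browsers from sniffing the file into text/html...
        response.headers['X-Content-Type-Options'] = 'nosniff'
        # ...and force a download rather than inline rendering.
        send_data upload.body,
                  type: upload.content_type,
                  disposition: 'attachment'
      end
    end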
My MVC site uses the AntiForgeryToken code, which works well in Chrome and Firefox. However, in IE10 I have noticed that it gives me the error:
required anti-forgery cookie "__RequestVerificationToken" is not present
It's definitely a cookie-related issue, as it works fine when I allow all cookies (i.e. the lowest privacy settings).
However, I have also noticed that when I go to GoDaddy and turn off domain forwarding masking (but leave the domain forwarding in), it also works fine.
Is there a way to get this working with the masking? (Masking is an option that forwards a domain while hiding the underlying hostname. I am doing this because I am using Azure websites and would rather have my users see my actual domain name, not xxx.azurewebsites.net.)
Thanks for any help here!
Domain forwarding masking works by hosting your real URL inside a frame. In that scenario, your real website content comes from a different domain than the main page's domain. As such, any cookies your site tries to set will be treated as third-party cookies and will be blocked by any browser set to block those kinds of cookies (including, apparently, IE10 with its default settings).
Frankly, I think you are fighting a losing battle here. These kinds of cookies are benign in your use case, but they look exactly like the kinds of cookies advertisers are using to track people across websites, and so I would expect browsers to be even more hostile to them as time goes by.
I think your options in this case are to not need cookies (e.g. don't use the anti-CSRF features provided by ASP.NET MVC), or to move your website to a host that lets you serve your content directly at the real URL (so that you don't need the GoDaddy masking technique). The latter is probably the best long-term solution.
Let's say I'm trying to create an application called Blue. Blue is a Ruby on Rails application that turns the background of any website blue. It also allows users to log in and keep track of the websites they've visited and turned blue.
In order to turn a website's background blue, I've created a web proxy that inserts <link href="http://www.example.com/blue.css" rel="stylesheet" type="text/css"> into the response's body. The proxy is implemented as a Rack application and is placed inside the Rails routes using the approach from the Rack in Rails 3 Railscast:
root :to => BlueProxy, :constraints => { :subdomain => "proxy" }
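For reference, a minimal sketch of what such a BlueProxy endpoint might look like (hypothetical: it assumes the target arrives as a ?url= parameter, and a real version would need error handling, non-HTML passthrough, and URL rewriting):

    require 'rack'
    require 'net/http'
    require 'uri'

    # Sketch of the proxy endpoint: fetch ?url=... and inject the
    # stylesheet link just before </head>.
    class BlueProxy
      STYLESHEET = '<link href="http://www.example.com/blue.css" ' \
                   'rel="stylesheet" type="text/css">'

      def self.call(env)
        request = Rack::Request.new(env)
        target  = URI(request.params.fetch('url'))
        page    = Net::HTTP.get_response(target)

        body = page.body.to_s.sub('</head>', "#{STYLESHEET}</head>")
        [page.code.to_i, { 'Content-Type' => 'text/html' }, [body]]
      end
    end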
I'm very concerned about security with this approach. I know by default the domain for the cookies in my application would be .example.com. If the user typed in a malicious URL, the website could manipulate the user's account. I could fix this by only allowing the www subdomain for cookies in the application. However, I'd also like the proxy to be able to store cookies for the proxied site as well.
Here are my three questions:
Is this a bad approach? Is there a better way to solve this problem?
What's the best way to keep sibling subdomain cookies separate in Rails?
Are there any other security concerns I'm missing?
This approach is dangerous, and I caution you against running a proxy for several reasons:
It brings up a host of legal issues, ranging from people accessing illegal content to you hosting (and modifying) content for your own benefit.
Your bandwidth (and hosting fees) will explode if the site gets popular.
Loading content inside an iframe has UX issues, like the browser back button not quite behaving as the user expects.
Running a proxy opens up several more attack vectors to your site (e.g. sending a permalink to a malicious site proxied through your site) that you'll have to consider from a security perspective.
Instead of running an open proxy (okay, maybe it's not completely open, but how hard is it for someone to sign up?) on your back end, consider using a browser extension or Greasemonkey script on the front end that can fetch its set of rules from your Rails app and then apply the stylesheet changes on the client side.
I'm an old hand at C but a raw newbie at Java/Tomcat.
I'm fine with Tomcat session management over HTTP alone. It's when I've come to look at switching to HTTPS that I've had problems.
I gather that with Tomcat you have to start with an HTTP session if you want to maintain a session as you switch from HTTP to HTTPS and back to HTTP. This works fine for me when the browser has cookies enabled.
But when the browser has cookies disabled (and URL rewriting is being used), switching from HTTP to HTTPS or back again causes a fresh session to be started each time. I'm assuming this is a security thing.
Q1 - Is it possible/desirable to maintain a session between HTTP and HTTPS using URL rewriting?
Q2 - If it isn't possible, then what do e-commerce developers do about non-cookie users?
I don't want to prevent non-cookie people from using my site. I do want some flexibility switching between HTTP and HTTPS.
Thanks for any help,
Steven.
It doesn't seem desirable to maintain a session between HTTP and HTTPS using the same cookie or URL token.
Imagine the case where your user is logged on, with a given cookie (or URL token) passed back and forth for every request/response on an e-commerce website. If someone in the middle is able to read that cookie, he can then log on to the HTTP or HTTPS variant of the site with it. Even if whatever the legitimate user is then doing is over HTTPS, the attacker will still be able to access that session (because he too will have the legitimate cookie). He could see pages like the cart or the payment method, and perhaps change the delivery address.
It makes sense to pass some form of token between the HTTP session and the HTTPS session (if you're using sessions), but treating them as one and the same would introduce a vulnerability. Creating a one-off token in the query parameter just for the transition could be a solution. You should, however, treat them as two separate authenticated sessions.
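A sketch of what such a one-off handoff token could look like, in plain Ruby (all names invented; a real store would need to be shared between workers and thread-safe):

    require 'securerandom'

    # One-off token for handing a user from the HTTP session over to a
    # brand-new HTTPS session (or back), without reusing either cookie.
    module HandoffToken
      STORE = {}
      TTL   = 30 # seconds; keep the token short-lived

      # Called by the HTTP side: mint a single-use token for this user,
      # then redirect to https://...?handoff=<token>.
      def self.issue(user_id)
        token = SecureRandom.urlsafe_base64(32)
        STORE[token] = { user_id: user_id, expires_at: Time.now + TTL }
        token
      end

      # Called by the HTTPS side: redeem exactly once, then start a
      # fresh session (new cookie) for the returned user.
      def self.redeem(token)
        entry = STORE.delete(token) # deleted on first read: single use
        return nil if entry.nil? || entry[:expires_at] < Time.now
        entry[:user_id]
      end
    end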
This vulnerability can sometimes appear on websites that use mixed HTTP and HTTPS content (certain browsers, such as Firefox, will warn you when that happens, although most people tend to disable the warning the first time it pops up). You could have your HTTPS session cookie for the main page, while that page contains images, say the company logo, served over plain HTTP. Unfortunately, the browser would send the cookie with both kinds of request (so an attacker would be able to capture the cookie then). I've seen it happen even when the image in question wasn't there any more: the browser would send the request, cookie included, even though the server returned a 404 Not Found.
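The usual defence against that particular leak is marking the cookie Secure, so the browser refuses to attach it to plain-HTTP requests (Tomcat has an equivalent secure attribute for its session cookie). A Ruby/Rack sketch of the idea:

    require 'rack'

    # A Secure, HttpOnly session cookie: browsers will not send it with
    # plain-HTTP requests, such as that HTTP logo image.
    response = Rack::Response.new('ok')
    response.set_cookie('JSESSIONID',
                        value: 'abc123',
                        path: '/',
                        secure: true,    # only ever sent over HTTPS
                        httponly: true)  # not readable from JavaScript
    puts response.headers['Set-Cookie']
    # e.g. "JSESSIONID=abc123; path=/; secure; HttpOnly"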