GitLab Pages and cookies - gitlab-pages

I am thinking about hosting my static website using GitLab Pages.
The website is completely static, no dynamic content (besides me updating it from time to time).
Now I am wondering: since I am in the EU, I have to add a cookie banner if applicable. Looking at my website's request headers, I can see the following cookie:
gitlab-pages=MTY3MzI2MDgwMXxSMGFLVnNPYVk2X1F3WTlDeVhkeENLMGw5blVBNDItTHRwNDlEQVJ1b2p1ZUMwVExRaUMwOF9zcXl0UElic21JbDFUajlzdl9tUkJJNm05U29yMnRBUFNEV1pvMGhycDJuLXZEMUV2bGN5cEhneWRrdC1EaDI2aEI4QXByR3YyYXl6OVZKSEJ0VnFveTJzdFNwU1NOdGFlOFZnLWFfMHRuUGEzd0s3QWd4aXhLOUMtcXlyQ3QwVExXRURSZEFKa0JWNWdiS0MzbmREQWR0MHlPdGVXTmZGSzN2cV9zaWZ6c0NtOWZiNXFIZ2RpMHlHeGI1N2RCT3UzRU42WDRpZGR4YjlfeTdYcm1TODVPVW5WYzZqdmZ4YU5mZWhsWFA3N0I4Zm9uME5aR3B1ellhYTBadXpqeGt1a0Q3YWlYN2FkQzloMzRNX2ZQQmhCMXQxa0NvQnh4a1BGWTlPdE5MOXVLdHl5T0oxRkcxNzBQTHZ0eFpFMW9QeTlHNTZoUHhlemxUUW4yaDJZRVZsNkF8uBra7CQZU8byJzld_UbWI8oEm_Rl-pfIPpynNRNoM1U=
I am wondering:
What does this cookie do?
Can I disable it somehow?
If I cannot disable it, do I have to make a cookie banner?

Related

Storing cross site cookies on Safari using Heroku + Rails

I'm not getting any errors or problems storing cookies on Chrome or Firefox, but with the default cookie settings on Safari I seem to have quite a bit of trouble getting cookies to store properly.
I'm doing this cross-site (rendering an iframe from another domain), so I'm not sure if there is something special I should be doing for Safari.
http://prntscr.com/e1yt22
This setting will always make my website work properly and store cookies.
http://prntscr.com/e1ytfc
This setting, however, will not.
EDIT 1: Someone suggested the following workaround:
The solution is to access the API/service from a sub-domain, e.g. “api.somedomain.com”. This should cause Safari to hold onto the cookie so it can be re-used for the CORS requests.
Will the subdomain workaround work if the two domains are different?
Will I run into trouble with cookies if I do this on Safari, which doesn't allow third-party cookies by default?
Example: <iframe src="xyz.example.com"></iframe> inside the xyz.com website?
Or does it have to be like this:
<iframe src="xyz.example.com"></iframe> inside the example.com website?
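For reference, the Rails side of that subdomain workaround usually just points the session cookie at the parent domain so every subdomain can read it. A minimal sketch, assuming a Rails app and an illustrative domain (not taken from the question):

    # config/initializers/session_store.rb -- sketch only
    Rails.application.config.session_store :cookie_store,
      key: '_myapp_session',
      domain: '.example.com',  # shared by example.com, api.example.com, xyz.example.com
      secure: true             # only send the cookie over HTTPS

Whether Safari accepts the cookie inside an iframe still depends on its third-party cookie setting; the shared parent domain only helps when the embedding page and the iframe really are subdomains of the same registrable domain, i.e. the second of the two layouts above.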

Authenticate user before displaying an iFrame

I am preparing to work on a project where I need to display a dashboard from an online application. Unfortunately, the use of an API is currently not possible. The dashboard can be embedded in an iFrame. However, when it is displayed it will prompt the user viewing the dashboard to log in to an account.
I have one paid account for this service. Are there any Rails gems to log in to the service before the iFrame is processed?
Or would a proxy within my Rails app be a better route to go?
Any pointers are appreciated!
Neither a Rails gem nor a proxy within your Rails app will work; they share the same limitation.
They both run on the back end, server side.
The authentication you need is client side.
Unless you mean proxying the ENTIRE thing: the auth request and all subsequent requests and user interactions with this dashboard. That could work, but see below.
The way authentication works (pretty much universally) is: once you log in to any system, it stores a cookie on your browser and then the browser sends that cookie for every subsequent request.
If you authenticate on the backend, that cookie will be sent to your Rails code and will die there; the user's browser will never know about it.
Also, it is not possible to do the auth server side, capture the cookie, and then have the user browse the site directly with their browser, for two reasons:
Sometimes auth cookies use information about the browser or HTTP client to encrypt the cookie, so sending the same cookie from a different client won't work.
You cannot tell a browser to send a cookie to a domain other than your own.
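To make the "dies on the server" point concrete, here is a sketch of what a server-side login capture looks like in Ruby (the URL and form fields are made up):

    # Sketch only: the login endpoint and field names are hypothetical.
    require 'net/http'

    uri = URI('https://dashboard.example.com/login')
    res = Net::HTTP.post_form(uri, 'email' => 'me@example.com', 'password' => 'secret')

    # The auth cookie arrives here, inside the Rails process...
    auth_cookie = res['Set-Cookie']
    # ...and unless you proxy every subsequent request yourself,
    # the user's browser never sees it.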
So your options are, off the top of my head right now:
If there is a login page that accepts form submissions from other domains, you could try to simulate a form submission directly to that site's "after login" page (the page the user gets redirected to once they fill in the login form). However, any modern web framework has XSRF protection (Cross-Site Request Forgery protection) and will disallow this approach for security reasons.
See if the auth this site uses offers any kind of OAuth, Single Sign-On (SSO) or similar authentication integration you can use. (Similar to an API, so you may have already explored this option.)
Proxy all requests to this site through your server. You will have to rewrite the HTML so that all images, CSS, stylesheets, and other assets are also routed through the proxy, or else rewrite the URLs in the HTML so they are no longer relative. You might hit various walls if the site wasn't designed for this use case: the site using relative URLs for assets you aren't proxying, the site referencing absolute URLs that cause cross-domain errors, and so on. Note that it is really hard to rewrite every last asset reference; it's not only the HTML you have to worry about, since JavaScript and CSS can contain URLs too. (A rough sketch of this option follows this list.)
You could write a bookmarklet or a browser extension that logs the user into the site.
Have everyone install LastPass.
Have everyone install the TamperMonkey browser extension (or an equivalent for other browsers), and write a small user script that runs custom JavaScript automatically to log the user in on that site.
Scrape that site for the info you need and serve it on your own site.
OK I'm out of ideas. :)
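For what it's worth, here is a rough sketch of what option 3 (proxying everything through your server) could look like in a Rails controller. The route, constant, and session key are illustrative assumptions, and a real site will need far more URL rewriting than this:

    # Assumed route: get '/proxy/*path' => 'dashboard_proxy#show', format: false
    require 'net/http'

    class DashboardProxyController < ApplicationController
      # Hypothetical origin of the third-party dashboard.
      DASHBOARD_ORIGIN = URI('https://dashboard.example.com')

      def show
        uri = DASHBOARD_ORIGIN.dup
        uri.path = "/#{params[:path]}"

        req = Net::HTTP::Get.new(uri)
        # Re-use the session cookie captured when we logged in server side.
        req['Cookie'] = session[:dashboard_cookie].to_s

        res = Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(req) }

        # Naive rewrite: absolute links to the dashboard come back through the proxy.
        # JavaScript- and CSS-generated URLs will slip through, as warned above.
        body = res.body.gsub(DASHBOARD_ORIGIN.to_s, "#{request.base_url}/proxy")
        render html: body.html_safe, layout: false, content_type: res['Content-Type']
      end
    end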

Adobe Analytics - Different domain for s_vi cookie

I'm working on an ecommerce site which uses both the Data Insertion API and JavaScript (AppMeasurement.js) to send data to Adobe's collection servers. I need to read the s_vi cookie value in order to send data from the backend.
When I look at the requests in Firefox, the s_vi cookie has a different domain than my domain (I'm testing on localhost), so I can't read it.
Any help is appreciated.
The s_vi cookie is set in a response from your Data Collection Server (e.g. 'metrics.yoursite.com'), so you can only see that cookie in a matching domain space (e.g. 'yoursite.com').
To test on localhost, you could try using Fiddler to map requests for 'yoursite.com' to your localhost (or machine name) so your browser will send the cookie with those requests.
By default, Adobe Analytics is implemented with third-party cookies, but because of the Same-Origin Policy, JavaScript can only read cookies that are set on the same domain as the page.
If you already have your own system in place for tracking visitors by an id, you can explicitly set s.visitorID and it will override the default id. If you go this route, then you don't need to read the cookie, as you already have the value exposed.
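If your backend happens to be Rails (an assumption; the question doesn't say), the "own visitor id" route can be as simple as minting a first-party cookie yourself, exposing it to the page so AppMeasurement can set s.visitorID, and reusing the same value in your Data Insertion API calls. A minimal sketch with an illustrative cookie name:

    # Sketch only: cookie name and helper are illustrative.
    class ApplicationController < ActionController::Base
      before_action :ensure_visitor_id

      private

      def ensure_visitor_id
        # A first-party cookie we control, unlike Adobe's s_vi cookie.
        cookies.permanent[:my_visitor_id] = SecureRandom.uuid if cookies[:my_visitor_id].blank?
        # Readable server side for Data Insertion calls, and can be written into
        # the page so the JavaScript sets s.visitorID to the same value.
        @visitor_id = cookies[:my_visitor_id]
      end
    end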
Alternatively, you can implement the Visitor ID Service which is a cross-domain 1st party cookie solution (Note: I have found that it does not work 100% cross-domain though, depending on how strict a visitor's browser settings are, particularly in IE). Because this is a first party cookie solution, you will then be able to read the cookie with javascript.

Using akamai caching in rails app - don't cache logged in users

I am attempting to use Akamai in my production app to cache basically every page when you are logged out, as only a small percentage of our users have accounts. However, I want to be able to serve logged-in users a non-cached version of the page.
It seems that I may be able to do this in the controller with something like:
headers['Edge-control'] = "no-cache, no-store"
Will this work? Is there a better way to handle this, perhaps from a lower level, like Rack? I am having a lot of trouble finding standard practices.
Thanks!
I just dealt with this situation with Akamai and WordPress. Even if Akamai honors headers, it's probably more robust to base the rule on a cookie, the same cookie you use to track the login. That way, caching is tied to something visible: if the cookie is not present, the user is not logged in. The header-based solution will be more prone to silent failures and would require more effort to validate for correct behavior.
This doesn't work because Akamai doesn't look at response headers. You can use cookies to do it, though.
Yes, you can in fact do this with headers.
Just send Edge-Control: no-store
Akamai does in fact examine response headers; how else could it honor Cache-Control headers from the origin, which is a very common configuration setting?
As user3995360 states, you're better off using cookies to tell Akamai not to cache the results for a number of reasons:
If Akamai has a cached version of your page, the logged-in user will be served that; your server won't have a chance to send a different header.
There's nothing to tell Akamai why the header is different for some requests; if your logged-in user managed to get the no-store header, and then an anonymous user's request caches the page, you're back to point 1.
That being said, when I've done this in the past we've had to involve the Akamai consultants to enable this feature on our setup.
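For illustration, the Rails side of the cookie-based rule might look like the sketch below. The cookie name, the current_user helper, and the Edge-Control values are assumptions, and the matching "bypass cache when this cookie is present" rule still has to be configured within Akamai itself:

    # Sketch only: assumes a typical current_user auth helper.
    class ApplicationController < ActionController::Base
      after_action :set_edge_cache_headers

      private

      def set_edge_cache_headers
        if current_user
          # A visible marker Akamai can key its bypass rule on,
          # plus a belt-and-braces response header.
          cookies[:logged_in] = '1'
          response.headers['Edge-Control'] = 'no-store'
        else
          cookies.delete(:logged_in)
          response.headers['Edge-Control'] = 'cache-maxage=10m'
        end
      end
    end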

Preparing my ASP.NET / MVC site to use SSL?

I'm getting ready to have an SSL cert installed on my hosting.
It is my understanding that (and correct me if I'm wrong...):
Once the hosting guys install the cert, I will be able to browse my site over HTTP or HTTPS (nothing will stop me from continuing to use HTTP)?
The only thing I need to do is add logic (in the case of MVC, controller attributes/filters) to force certain pages of my choosing to redirect to HTTPS (for instance, adding a [RequireHttps] attribute sparingly).
Do I have to worry about doing anything extra to make sure I'm using SSL properly? I'm not sure if I need to change any logic having to do with:
Cookies
PayPal Express integration
Also, I plan on adding [RequireHttps] only to the shopping cart, checkout, login, account, and administration pages. I wish to leave my product browsing/shopping pages on HTTP, since I heard there is more overhead when using HTTPS. Is this normal/acceptable/OK?
One more question... I know ASP.NET stores some login information in the form of an auth cookie. Is it okay that a user logs in on an HTTPS page but can then go back and browse on an HTTP page? I'm wondering if that creates a security weakness, since the user is logged in and browsing over HTTP again. Does that defeat the point of using SSL?
I'm kind of a newb at this... so help would be appreciated.
Starting with your questions: (1) yes, nothing will stop you from serving the same pages over either HTTP or HTTPS.
And (2) yes, you need to add your own logic for which pages will be shown only over HTTPS and which over HTTP. If someone is wondering why not serve everything over HTTPS, the reason is speed: pages sent over HTTPS are bigger and the encoding/decoding takes a little longer, so if you do not need HTTPS, just switch to HTTP.
Switching Between HTTP and HTTPS Automatically is a very good piece of code to use to implement the switching logic quickly and easily.
Cookies
When a cookie has to do with the user's credentials, you need to force it to be transmitted only on secure pages. What this means is that if you set a cookie as secure over HTTPS, that cookie is NOT transmitted on non-secure pages, so it stays secure and a man in the middle cannot steal it. The catch is that this cookie also cannot be read on HTTP pages, so you can only know whether the user is A or B on secure pages.
Cart - Products
Yes, this is normal: leaving the products and the cart on an unsecured connection is fine because the information is not that sensitive. You switch to HTTPS pages when you get to the user's real data, like name, email, address, etc.
Auth cookie
If you set it as secure only, then this cookie is not visible/readable/present on unsecured pages. It is an issue if you do not make it secure only.
Response.Cookies[s].Secure = true;
A few more words
What we do with secure and non-secure pages is actually split the user data into two parts: one that is secure and one that is not. So we actually use two cookies, one secure and one not secure.
The non-secure cookie is, for example, the one that ties together the products in the cart, or maybe the user's history (which products they viewed). This is also data we do not really mind if someone gets, because even a proxy can see the user's history, or what the user viewed, from the URLs.
The secure cookie is the authentication cookie, which keeps critical information about the user. So the non-secure cookie travels with the user everywhere on the site, while the secure one is sent only on checkout, logged-in pages, etc.
Related
MSDN, How To: Protect Forms Authentication in ASP.NET 2.0
Setting up SSL page only on login page
Can some hacker steal the cookie from a user and login with that name on a web site?
1) Yes, you are right.
2) Yes. You can optionally handle HTTP 403.4 code (SSL required) more gracefully, by automatically redirecting the client to the HTTPS version of the page.
As for authentication cookies, I've found this MSDN article for you. Basically, you can set up your website (and the client's browser) to only transmit the authentication cookie via HTTPS. This way it won't be subject to network snooping over an unencrypted channel.
Of course, this is only possible if all of your [Authorize] actions are HTTPS-only.
