I was testing a site with cookies disabled and noticed an HHOJSID parameter in the URL path.
It could be a session ID encoded in the URL, like jsessionid for J2EE web applications.
I searched Google and found a lot of examples but no technical description.
Since this parameter seems to appear exclusively in URLs for HP's Home and Home Office online store, I suggest it stands for "Home and Home Office Java Session ID".
I'm sure you're right when you say it's a URL-based session ID - it's just that HP has modified its name for their online store.
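For anyone unfamiliar with URL-based session IDs: when a J2EE container falls back to URL rewriting, the jsessionid mentioned above is appended to the path as a parameter, along the lines of the following (host and ID value are made up):
http://www.example.com/store/product.jsp;jsessionid=1A2B3C4D5E6F7A8B9C0D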
Related
I am working on ASP.NET MVC 5. When I click a link (placed on another website) I navigate to the UserDetails.cshtml page. Basically that 3rd party site is passing the UserName & Password to my site, and using that I authorize & display further user info.
That works, but the URL looks like this:
localhost:8080//Admin/UserDetails/UserName/PWD.
I don't want to show the UserName & Password in the URL, i.e. the URL should look something like:
localhost:8080//Admin/UserDetails/
One possible solution could be to rewrite the URL in IIS (http://www.hanselman.com/blog/ASPNETMVCAndTheNewIIS7RewriteModule.aspx).
But I believe there is an easier way to handle this by using the routing mechanism of MVC.
Please help me figure this out.
EDIT:
As many of you are wondering why I am not doing a form POST here, let me reframe my question. I have no control over the third-party application, so I can't ask them to do a form POST to my MVC application. Also, the third-party application is an Oracle reporting application (OBI), so doing a POST from that application might not be feasible anyway...
Let me reverse engineer your requirements from your question:
I want to have an URI that when invoked will give access to a secured section of my website. This URI must be clicked by visitors of a third-party site, whom I give that URI to. I want to hide the credentials from the URI.
You cannot do this; the requirements conflict. You cannot hand out URIs that will authenticate anyone who fires a request to them.
You could do something with a token (like http://your-site/auth/$token), but then still, anyone with access to that URI can use it to authenticate themselves, or simply put it up on their own website.
If you have data you want to expose to a third-party site, let that site perform an HTTP request (with tokens, usernames, headers or whatever you want to use to authenticate) in the background to your site, and display the response on their site. Then the visitor won't see that traffic, can't share the URI, and everything stays secure.
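To make that concrete, here is a minimal sketch of the background-request idea in Node/Express-style TypeScript; the endpoint paths and the shared API_TOKEN are assumptions, not anything your MVC app already exposes:
// Runs on the third-party server, never in the visitor's browser.
import express from "express";

const app = express();
const API_TOKEN = process.env.API_TOKEN ?? ""; // hypothetical shared secret, agreed out of band

app.get("/user-details", async (_req, res) => {
  // Server-to-server call: the secret travels in a header,
  // so it never appears in any URL a visitor could see or share.
  const upstream = await fetch("https://your-site.example/api/userdetails", {
    headers: { Authorization: `Bearer ${API_TOKEN}` },
  });
  if (!upstream.ok) {
    res.status(502).send("Could not load user details");
    return;
  }
  res.send(await upstream.text()); // embed the result in the third-party page
});

app.listen(3000);
The visitor only ever sees the third-party page; the credentials or token never ride along in a clickable link.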
No. No. NO. Like seriously, NO. Any sensitive information should be sent via a POST body over a secure connection (HTTPS). You can't "hide" information in a GET request, because it's all part of the URI, i.e. the location of a particular resource. If you remove a portion, it's an entirely different location.
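For illustration only, a hedged sketch (TypeScript fetch; URL and field names are placeholders) of what sending credentials in a POST body over HTTPS looks like from the client side:
// Credentials go in the request body over TLS, not in the URL,
// so they don't end up in browser history, access logs or referrer headers.
async function submitCredentials(userName: string, password: string): Promise<boolean> {
  const response = await fetch("https://your-site.example/Admin/UserDetails", {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: new URLSearchParams({ userName, password }),
  });
  return response.ok;
}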
UPDATE
I find it extremely hard to believe that any third-party application that needs to authenticate via HTTP, and isn't designed by a chimp with a typewriter, wouldn't support a secure method to do so, especially if it's an Oracle application. I'm not familiar with this particular app, and no offense meant here, but I would sooner believe that you've missed something in the documentation or simply haven't found the right way to do it yet than believe you have to send clear-text credentials over GET.
Regardless, as I said previously, there's no way to hide information in a GET request. All data in a GET is part of the URL, and is therefore plainly visible in the browser location bar. Unfortunately, I have no advice for you other than to look closer at the documentation, or even reach out to Oracle if you have to. Whether by POST or something like OAuth, there almost has to be another way.
I'm working on an ecommerce site which uses both the Data Insertion API and JavaScript (AppMeasurement.js) to send data to Adobe's collection servers. I need to read the s_vi cookie value in order to send data from the backend.
When I look at the requests in Firefox, the s_vi cookie has a different domain than my domain (I'm testing on localhost), so I can't read it.
Any help is appreciated.
The s_vi cookie is set in a response from your Data Collection Server (e.g. 'metrics.yoursite.com'), so you can only see that cookie in a matching domain space (e.g. 'yoursite.com').
To test on localhost, you could try using Fiddler to map requests for 'yoursite.com' to your localhost (or machine name) so your browser will send the cookie with those requests.
By default, Adobe Analytics is implemented with third-party cookies, but because of the Same-Origin Policy, JavaScript can only read cookies that are set on the same domain as the page.
If you already have your own system in place for tracking visitors by an id, you can explicitly set s.visitorID and it will override the default id. If you go this route, then you don't need to read the cookie, as you already have the value exposed.
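A minimal sketch of that override, assuming your backend has already put its own visitor ID somewhere the page can read it (the data attribute below is just an example source); s is the tracking object created by AppMeasurement.js:
declare const s: { visitorID?: string }; // global provided by AppMeasurement.js

// Hypothetical: the backend rendered its own visitor id into the page markup.
const ownVisitorId = document.documentElement.dataset.visitorId;
if (ownVisitorId) {
  s.visitorID = ownVisitorId; // overrides the default s_vi-based id for this visitor
}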
Alternatively, you can implement the Visitor ID Service which is a cross-domain 1st party cookie solution (Note: I have found that it does not work 100% cross-domain though, depending on how strict a visitor's browser settings are, particularly in IE). Because this is a first party cookie solution, you will then be able to read the cookie with javascript.
I wonder if there's a way to tell TYPO3 to share the sessions / cookies between different domains?
We wrote an Extbase extension on a multi-language/multi-domain site.
We store search words from a search form in the user session. If the user switches the page language, they should get the same results as before, without having to re-fill the search form.
One way would be to tell the browser to store several cookies at the same time - one for each domain/language. How can this be achieved with TYPO3 / Extbase?
By default, there is no way to set cookies for a different domain, with or without TYPO3. This is a security measure implemented in every browser (or do you want me to set / read your cookies from yourbank.com when you visit my website? ;-))
You have to create some helper script that does this for you. One way could be:
example.com is loaded
this page includes an iframe pointing to a PHP script (or TYPO3 site, e.g. with eID) on example.org, with a GET parameter containing the session ID
the script loaded from example.org reads the GET parameter and sets a cookie with that session id (or whatever parameter you want to transfer).
afterwards the cookie is also available when browsing example.org
I have never tried this, but I'm pretty sure it will work with PHP. Maybe it's even possible with pure JavaScript, but I'm not so sure. In any case, think about the security holes such a script opens up; if in doubt, sign the parameters (or require a token)! A rough sketch of such a helper endpoint follows.
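Here is roughly what that helper on example.org could look like. It is sketched in Node/Express-style TypeScript instead of PHP/TYPO3 eID purely to show the idea; the query parameter, cookie name and route are assumptions:
import express from "express";

const app = express();

// example.com embeds this as e.g.
// <iframe src="https://example.org/session-bridge?sid=abc123" style="display:none"></iframe>
app.get("/session-bridge", (req, res) => {
  const sid = String(req.query.sid ?? "");

  // IMPORTANT: verify a signature / token on sid before trusting it,
  // otherwise anyone can fixate a session id on your visitors.
  if (sid) {
    res.cookie("fe_typo_user", sid, { // TYPO3's frontend session cookie, if that is what you want to share
      httpOnly: true,
      secure: true,
      sameSite: "none", // needed for a cookie set from inside a cross-site iframe
    });
  }
  res.status(204).end();
});

app.listen(3000);
Be aware that current browsers increasingly block third-party cookies set from iframes, so test this in the browsers you care about.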
One of my sites is for old mobile phones that don't accept cookies so it uses a URL-based Session ID.
However, Google is indexing the Session ID, so when my site is searched on Google, all the results come up with a specific Session ID.
On most occasions, that Session ID is no longer valid by the time a guest clicks on it, but I've had at least one case where a guest clicked on a link from Google and it actually logged them into someone else's account, which is obviously a huge security flaw.
So how can I keep Google from indexing the Session ID in my URLs? In case it helps, the Session ID has always been set to "Representative URL" in Google's Webmaster Tools.
You can do this by placing a robots.txt file in your root web directory to tell Googlebot and all other crawlers not to crawl URLs containing that parameter.
Here is an example:
Let's say the URLs you want to block are of the form:
http://www.mywebsite.com/page.html?id=1234
The robots.txt rule to block URLs containing the id parameter is:
User-agent: *
Disallow: /*?id=
You can find out more about robots.txt at http://www.robotstxt.org
Read more about this at http://www.seochat.com/c/a/Search-Engine-Optimization-Help/Preventing-Duplicate-Content-on-an-ECommerce-Site-from-Session-IDs/1/
Check this out: https://developers.google.com/search/docs/advanced/crawling/consolidate-duplicate-urls. You can set canonical URLs and Googlebot will use that URL when crawling your webpage; this can also solve duplicate-URL issues for the same page.
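In practice that means every session-specific variant of a page points at the clean URL in its <head>, for example (URL is a placeholder):
<!-- served on http://www.mywebsite.com/page.html?id=1234 and every other session variant -->
<link rel="canonical" href="http://www.mywebsite.com/page.html" />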
What are the pros and cons of these URL formats for a website that serves both mobile and desktop content...
mobile.example.com
example.com/mobile
no explicit URL, but send back dynamic content based on the browser, or a query-string variable?
thanks
The W3C recommends: "When accessing site entry points users should not have to enter a filename as part of the URI. If possible, configure Web sites so that they can be accessed without having to specify a sub-domain as part of the URI."
So m.example.com or example.com/m would be the best solution.
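If you do go with m.example.com, the usual companion is a small user-agent check that redirects phones landing on the desktop host. A rough sketch in Node/Express-style TypeScript (the host names and the UA heuristic are assumptions; real projects usually use a detection library or do this at the CDN):
import express from "express";

const app = express();

app.use((req, res, next) => {
  const ua = req.headers["user-agent"] ?? "";
  const looksMobile = /Mobi|Android|iPhone/i.test(ua); // crude heuristic

  if (looksMobile && req.hostname === "example.com") {
    // send phones to the mobile host, preserving the requested path
    res.redirect(302, `https://m.example.com${req.originalUrl}`);
    return;
  }
  next(); // desktop visitors (or requests already on m.example.com) are served as usual
});

app.listen(3000);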