I have a Firefox extension displaying its own HTML page via a chrome:// URL, and scripts in it are running with chrome privileges. For users' security, I want to add a Content Security Policy to this page.
The obvious thing to do is to add it via a <meta> tag, but that's not yet supported in Firefox (bug 663570). Update: Bug 663570 was fixed in Firefox 45, but my attempts to use a <meta> tag caused Firefox to crash. Bug 923902 seems to be the new bug to watch.
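For reference, the standard <meta> form looks like this (the policy value here is only an illustration, not necessarily the one I used):

<!-- Declares a CSP directly in the page; the directives are just an example. -->
<meta http-equiv="Content-Security-Policy" content="default-src 'self'">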
Is there any way to implement a CSP right now? Some way to fake the appropriate HTTP header for a chrome:// URL?
I asked this question way back in the days of XUL add-ons, which are long gone - in this era of WebExtensions, I could define a CSP in manifest.json, although the default CSP is plenty secure.
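For instance, a minimal sketch of that manifest.json key (manifest v2 syntax; the name and version are placeholders, and the policy shown is the documented default):

{
  "manifest_version": 2,
  "name": "my-extension",
  "version": "1.0",
  "content_security_policy": "script-src 'self'; object-src 'self'"
}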
I'm running Firefox 36.0.4 on Windows 7 32-bit. I've disabled all add-ons, extensions and user scripts before retesting this.
I'd like to step through JavaScript code that is served up in a <script> tag in the HTML document being produced by a Java (Tomcat) web server.
Unfortunately, when I select the HTML document under Debugger > Sources, what loads is the application's login page - it appears that session information is not being sent when requesting the source.
I stepped through the server-side code and found that the correct session cookie values were being sent for the real page request and some AJAX requests sent by the page. However, when I tried to load the page source in the JavaScript debugger, I found that an incorrect session cookie was being sent by the JavaScript debugger.
I can replicate this behaviour in other webapps, not just my own - for example, on Stack Overflow.
Is this a configuration issue, or a bug in the Firefox Developer Tools?
I can't reproduce your problem using Stack Overflow as an example, at least in Firefox Developer Edition (currently version 38).
One thing that might help: try disabling the cache while the toolbox is open. This setting is in the developer tools settings panel (click the 'gear' icon at the top right of the toolbox).
After reviewing canuckistani's answer, I downloaded Firefox Developer Edition. At first, the problem seemed to be fixed.
Five minutes in, I became sick of being asked whether to remember passwords and of having to manually clear session cookies (I prefer clearing them by simply closing the browser - it makes testing easier).
As usual, I went to Options > Privacy > History to disable this behaviour by setting the value to Never remember history.
Changing this setting requires the browser to restart. However, upon restarting, I once again saw the same erroneous behaviour - the wrong session cookie was being sent to the web application again.
The workaround here is to avoid the Never remember history setting. I have filed a bug report on Mozilla's Bugzilla.
The method for including scripts in my WordPress plugin is covered in another post: how to load jquery dialog in wordpress using wp_enqueue_script?
I think this works fine for me, but I'm getting a weird error in the Firefox developer tools console when I load my page, after enqueueing the jQuery UI stuff (JS and CSS). Here is my code:
wp_register_script( 'myplugin-jquery-ui', plugins_url( 'myplugin/js/jquery-ui.min.js' ) );
wp_enqueue_script( 'myplugin-jquery-ui' );
But when I load the page in Firefox, the console says:
Cross-Origin Request Blocked: The Same Origin Policy disallows reading
the remote resource at
http://fonts.gstatic.com/s/opensans/v10/u-WUoqrET9fUeobQW7jkRT8E0i7KZn-EPnyo3HZu7kw.woff.
This can be fixed by moving the resource to the same domain or
enabling CORS.
I can't find "fonts.gstatic.com" referenced ANYWHERE in ANY of my files, least of all the jquery-ui.min.js file. Can you please help me understand a) why/how I'm getting this error, and b) if it's something I should just ignore?
And if I only need it for the dialog plugin, should I be doing this differently?
This is a bug on Google's side: sometimes it doesn't serve the CORS header correctly, for reasons only they know. A bullet-proof way to avoid the problem is to download the font files and serve them yourself.
You can inspect the response headers when the .woff file is served, and you will see that the header is missing whenever the browser fails to load the font. If you don't trust your browser, check with a network sniffer tool like Wireshark.
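For example, after downloading the .woff file, a self-hosted rule could look like this (a minimal sketch - the font name matches the one in the error above, but the local path is hypothetical):

/* Self-hosted copy of Open Sans, replacing the fonts.gstatic.com request.
   Point the url() at wherever you store the downloaded file. */
@font-face {
  font-family: 'Open Sans';
  font-style: normal;
  font-weight: 400;
  src: url('/fonts/opensans-regular.woff') format('woff');
}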
I need to point one page to another and then scroll to a particular section. In Chrome and Firefox, using a URL like www.example.com#section1 does the trick. (#section1 can be an anchor or an element's id.)
However, in Safari, the hash disappears when I click the link.
Why is this happening? Is it possible to do it on Safari? If not, how can I get around this problem?
When using hyperlinks that point to sections inside other pages, remember to add a slash (/) before the hash for cross-browser compatibility.
E.g.: www.example.com/#item-1
www.example.com#item-1 isn't accepted by all browsers (although apparently Chrome and Firefox do accept it).
Case: I had parameters after the '#', like url#myParam=123. When I changed the parameters, e.g. to url#myParam=789, Safari sometimes loaded the previous page based on myParam=123, even though the address bar showed myParam=789.
Solution: use url?#myParam=123 instead; Safari will then load the new page every time. Putting '?' before '#' solved my problem.
I had a related problem with Safari (on iPhone/iOS) seemingly stripping off the hash/fragment when doing:
var newHash = ...;
window.location.replace("#" + newHash);
The actual problem was a JavaScript error that only appeared in Safari. Since I could not easily access a JavaScript console on the iPhone, I chose to download an old version of Safari for Windows (related post, download).
Then I could replicate the iPhone problem on my Windows desktop using that old version of Safari. At that point, I found that a script had a missing ']'. This was a legitimate bug, but it was somehow ignored by Chrome, Firefox, and IE.
window.location.replace() was not even being called, because execution aborted on the JavaScript error. So the problem wasn't that Safari was stripping the hash, even though it appeared that way from multi-browser testing.
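If you're ever stuck without a console again, a minimal sketch of one way to surface such errors directly on the device (the alert format is just an illustration):

// Report otherwise-silent script errors visibly on browsers with no
// accessible developer console (e.g. older iOS Safari). Put this in its
// own early script block so parse errors in later scripts are caught.
window.onerror = function (message, source, lineno) {
  alert("JS error: " + message + "\n" + source + ":" + lineno);
  return false; // let the browser's default error handling continue
};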
I just experienced an issue like this. I was using a URL rewrite in the ASP.NET web.config. With Safari, the hash and everything after it was removed. After trying some of the things mentioned above, I was still having problems. The issue for me was that this was all happening under HTTPS: once I specified the full URL in the redirect, including the https:// scheme, the redirect worked correctly and preserved the hash. Note this wasn't an issue with Chrome or Firefox.
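A sketch of what that looks like with the IIS URL Rewrite module in web.config (the rule name and paths are hypothetical) - the key point is the absolute https:// URL in the action:

<!-- Redirect with a full absolute URL so the browser can preserve the
     #fragment; a relative target tripped up Safari in my case. -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="redirect-to-canonical">
        <match url="^old-path$" />
        <action type="Redirect" url="https://www.example.com/new-path"
                redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>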
For me it was exactly the same issue that mrbinky3000 stated above: the server's mod_rewrite was killing the hash in Safari.
The solution was to use a full absolute link like:
http://www.example.com/path/#item-1
I'm looking for a Chrome or Firefox extension that will automatically validate my web pages for specific URLs.
I don't want to auto-check everything I browse, only the specified URL patterns or domain names.
The validation doesn't have to be perfect, but it has to be pretty fast and find things like:
- missing images, CSS or JS files
- JavaScript warnings/errors
- invalid links (404s)
My Firefox extension loads content from a 3rd-party site into an overlay panel. This content is user-generated and will sometimes, for instance, contain an image tag that is not closed, which causes a mismatched tag error to be thrown, and the extension fails. Is there any way I can sandbox this content so that these kinds of errors are not an issue? I was thinking maybe load the content into a blank iframed page, but I was wondering if there might be a cleaner solution.
Unfortunately, unless you're getting back XML, there is no XPCOM solution for parsing. Your best bet is what you suggested - placing the content in an iframe.
You can find some more discussion about the topic at: http://www.mozdev.org/pipermail/greasemonkey/2005-April/001255.html
Your guess about an iframe was correct; there's no better way to do it (as of Firefox 3.5): see Parsing HTML From Chrome on MDC.
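A minimal sketch of what that can look like in XUL-era chrome code (assumptions: 'myext-panel' is the overlay panel's id and thirdPartyUrl holds the remote page's URL):

// type="content" keeps the loaded document unprivileged, so its parse
// errors and scripts stay isolated from the extension's chrome code.
const XUL_NS =
  "http://www.mozilla.org/keymaster/gatekeeper/there.is.only.xul";
var frame = document.createElementNS(XUL_NS, "iframe");
frame.setAttribute("type", "content"); // no chrome privileges inside
frame.setAttribute("src", thirdPartyUrl);
document.getElementById("myext-panel").appendChild(frame);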