I am using the Progressive Web App module for Drupal, and it loads the service worker from /pwa/serviceworker.js.
According to this article, you need to load the service worker from the site root: https://frustrated.blog/2016/07/17/pwa_step_one.html (see also https://github.com/GoogleChromeLabs/sw-toolbox/issues/158).
Can I load the service worker from anywhere, say modules/pwa/js/serviceworker.js? It seems to be working that way.
If you load a service worker from /modules/pwa/js, it can only control resources under that path. So you have to serve your service worker from the root of your public path. One solution is to use a URL rewrite, for example with an .htaccess file.
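For example, a minimal rewrite in the .htaccess at the web root might look like this (the rewrite target below assumes the module path from the question):

    <IfModule mod_rewrite.c>
      RewriteEngine On
      # Serve the module's worker from a root URL; an internal rewrite keeps the
      # registration URL at /serviceworker.js, so its scope covers the whole site.
      RewriteRule ^serviceworker\.js$ /modules/pwa/js/serviceworker.js [L]
    </IfModule>

With that in place you would register the worker with navigator.serviceWorker.register('/serviceworker.js') so the browser sees it at the root.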
I've got an app on Heroku that contains uglified JS code. I'd like to include my original sources (.js) in my public folder so that I can refer to them from a source map for debugging. I don't want the source files to be viewable by just anyone, however: I'd like to restrict access to a certain set of IPs.
In other words, in my Rails app on Heroku I'd like to have a file here:
myapp.herokuapp.com/unminified_sources/my_file.js
And I'd like to restrict access to this file to a certain IP (mine).
Is this possible on Heroku? How? Can I use an .htaccess file?
You can put the file behind an unminified_sources controller with a my_file action that responds to JS requests and restrict access that way. You can restrict it in the route or add a before_filter that checks the request IP.
You can stream the file: http://guides.rubyonrails.org/action_controller_overview.html#sending-files
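A rough sketch of that idea, with a hypothetical UnminifiedSourcesController and allowed-IP list (all names and addresses here are illustrative):

    # app/controllers/unminified_sources_controller.rb
    class UnminifiedSourcesController < ApplicationController
      ALLOWED_IPS = %w[203.0.113.7]   # put your own IP(s) here

      before_filter :restrict_ip       # before_action in newer Rails

      def my_file
        # Stream the original source only to whitelisted clients.
        send_file Rails.root.join("unminified_sources", "my_file.js"),
                  :type => "application/javascript", :disposition => "inline"
      end

      private

      def restrict_ip
        # request.remote_ip respects X-Forwarded-For, which matters behind Heroku's router.
        head :forbidden unless ALLOWED_IPS.include?(request.remote_ip)
      end
    end

You would also need something like get 'unminified_sources/my_file' => 'unminified_sources#my_file' in your routes.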
I have changed all the URLs of my website. (The domain is the same; for example: http://www.example.com/category/sample ----> http://www.example.com/Category/Sample)
Now the site has lots of 404 pages that are affecting my SEO.
What should I do to solve this problem? Any suggestions?
Thank you
You can proceed by changing the context root for your website. The context root determines the URL of a web application.
Click here for a short article on making changes to context roots.
The process may change depending on the server you are using.
Just create a sitemap.xml file. There are many free online tools that will generate one for you: submit your website URL and within a few seconds they will produce a sitemap.xml file. Download it and place it in your root directory. When a crawler runs through your website it will pick up all of your links, and within a few days the updated links will show up in the results pages of search engines like Google.
Note: also, don't forget to reference the sitemap.xml file in your robots.txt file.
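For example, a single line at the end of robots.txt points crawlers at the sitemap (the domain here is just the example one from the question):

    Sitemap: http://www.example.com/sitemap.xml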
When we set up Laravel, the default URL has "public" in it.
Example: localhost/laravel/public/users
We always want to remove the "public" from our URL, especially when we go to production.
I searched the internet and it looks like removing "public" from the URL takes some work: moving files to another folder, modifying .htaccess, etc.
My question is, what's the purpose of the "public" part of the URL?
Why doesn't Laravel provide an easy way of removing "public" from the URL?
I know Laravel is popular these days, but why give beginners a headache over just removing a part of the URL?
Let me know your thoughts.
The public directory holds all the files that should be accessible from the outside.
This way nobody can access any other file in your application.
How do you remove the public from your URL? Well, you don't "remove" it at all. Instead you should point your domain (virtual host) directly at public (this is called the document root).
If you don't have the possibility to do that (usually because you're on shared hosting with very restricted permissions), then and only then do you need to move files or use a workaround with .htaccess. However, if your host doesn't allow you to set the document root, I would consider switching to another one...
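If you can set the document root, a minimal Apache virtual host would look something like this (paths and server name here are just examples):

    <VirtualHost *:80>
        ServerName mysite.test
        # Point the document root at Laravel's public directory so that
        # index.php is the only entry point exposed to the web.
        DocumentRoot /var/www/laravel/public

        <Directory /var/www/laravel/public>
            AllowOverride All    # let Laravel's own .htaccess handle the rewrites
            Require all granted
        </Directory>
    </VirtualHost>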
By the way, this is not really unique to Laravel. A lot of PHP frameworks (Zend Framework, Symfony, etc.) have a dedicated directory for files that should be accessible, separated from the framework core and your application.
If you're having trouble setting up your development environment correctly, you should try Laravel Homestead. It's a pre-configured Vagrant box that makes it very easy to get your site up and running locally (and it comes without public in your URL!).
When you make your site's root point to the public folder, e.g. www.mysite.com points to /path/to/laravel/public, there are a few advantages.
Files outside of public, like your .env with your passwords, cannot be accessed with tricks like www.mysite.com/../.env, and other common "exploits" are prevented just by taking this simple approach.
It's quite a common practice in other frameworks too, not only in Laravel or PHP.
Basically, this is what my app does:
1. It sends an AJAX request.
2. The server creates a file.
3. The server sends back the URL of the file location.
4. The client side will attempt to create a dialog to download the file at that location (probably using a frame? I haven't got this far yet).
My question is, how do I dynamically route to the files I create so that they are accessible when users browse to them? If I don't add a route for them, users will get a 404 when they try to access the directory they're in.
The files are currently stored in a folder under public.
Would the best way to deal with this be to make the folder somehow not require a route, so that it can be browsed directly, and then put an index page on it so users can't view the full list of files? If so, please let me know how I can accomplish this. On a side note, if you have an idea of how I can get JS to display the download dialog, let me know.
It's Rails 3 by the way.
Thanks!
For a fully private set of files: choose a place for your files outside your public directory, then configure X-SendFile support in your web server, and finally use send_file in your Rails application.
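A minimal sketch of that setup, assuming a hypothetical DownloadsController and a private_files directory outside public (both names are illustrative):

    # config/application.rb: tell Rails which header your server understands,
    # e.g. for Apache with mod_xsendfile:
    #   config.action_dispatch.x_sendfile_header = "X-Sendfile"

    # app/controllers/downloads_controller.rb
    class DownloadsController < ApplicationController
      FILES_DIR = Rails.root.join("private_files")

      def show
        # File.basename guards against path traversal in the requested name.
        path = FILES_DIR.join(File.basename(params[:filename].to_s))
        if File.file?(path)
          # With X-SendFile configured, Rails only sets headers and the web
          # server streams the file itself.
          send_file path, :disposition => "attachment"
        else
          head :not_found
        end
      end
    end

Then have your AJAX response return the URL of that action instead of a path under public.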
I have an ASP.NET MVC website that works in tandem with a Windows Service that processes file uploads. For easy maintenance of the site, I'd like the log file for the Windows Service to be accessible (to me, only) via the website, so that I can hit http://myserver/logs/myservice to view the contents of the log file. How can I do that?
At a guess, I could either have the service write its log file in a "Logs" folder at the top level of the site, or I could leave it where it is and set up a virtual directory to point to it. Which of these is better - or is there another, better way?
Wherever the file is stored, I can see that there's going to be another problem. I tried out the first option (Logs folder in my website), but when I try to access the file via HTTP I get an error:
The process cannot access the file 'foo' because it is being used by another process.
Now, I know from experience that my service keeps the file locked for writing while it's running, but that I can still open the file in Notepad to view the current contents. (I'm surprised that IIS insists on write access, if that's what's happening).
How can I get around that? Do I really have to write a handler to read the file and serve it to the browser myself? Or can I fix this with configuration or some such?
PS. I'm using IIS 7, if that helps.
Unfortunately, I'm afraid you'll have to write a handler that will open the file and return it to the client.
I've written an IIS Manager extension that displays server log files, and I noticed that even a simple
System.IO.File.OpenRead("")
can still run into the same problem and return the same error. It was kind of confusing.
In the end I used
System.IO.File.Open("", FileMode.Open, FileAccess.Read, FileShare.ReadWrite)
and I could easily open the file while the server was writing logs to it :)
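Putting that together, a handler/action along these lines should work (the controller name and log path are just placeholders):

    using System.IO;
    using System.Web.Mvc;

    public class LogsController : Controller
    {
        // Illustrative path; point it at wherever the Windows service writes its log.
        private const string LogPath = @"C:\Logs\MyService.log";

        public ActionResult MyService()
        {
            // FileShare.ReadWrite lets us open the file even while the
            // service still holds it open for writing.
            using (var stream = new FileStream(LogPath, FileMode.Open,
                                               FileAccess.Read, FileShare.ReadWrite))
            using (var reader = new StreamReader(stream))
            {
                return Content(reader.ReadToEnd(), "text/plain");
            }
        }
    }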
I think a virtual directory is an "okay" solution, if you add the directory (application) with read-only rights, and perhaps directory browsing too (so you can see the folder contents rendered by IIS).
(But once you do that, consider that you also allow anonymous access to that folder, unless you enable authentication, so watch out for any "secret" contents of the log files that you might expose. Just a thought.)
Another approach, which I prefer myself, is to make an MVC/ASP.NET page that does the lookup in the folder in normal code, so that you have full control over what data is shown in the HTML.
You can open the files as TextStreams in read-only mode.
If it's a problem to gain access to the log folder, I would use the virtual directory with read-only access and then write something that renders the log files as HTML, with whatever level of detail I want. Perhaps even add some sort of "login" first. But it all depends on your security requirements and the contents of the log files.
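A rough sketch of that kind of page, with a hypothetical controller name and log folder (both illustrative):

    using System.IO;
    using System.Linq;
    using System.Web.Mvc;

    public class LogBrowserController : Controller
    {
        private const string LogFolder = @"C:\Logs";   // illustrative path

        [Authorize]   // some sort of "login" first
        public ActionResult Index()
        {
            // List only *.log files so nothing else in the folder is exposed,
            // and let the view decide how much detail to render.
            var files = Directory.EnumerateFiles(LogFolder, "*.log")
                                 .Select(Path.GetFileName)
                                 .ToList();
            return View(files);
        }
    }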
Is this meaningful to you? If not, please explain more, as I've been through this thought process a few times for similar situations.