Best practice to add static content to be served by NGINX?

I have found extensive documentation on how to serve static content with NGINX, but I haven't found a good source of information on best practices for getting the content (the files in the file system) added in the first place.
Is POSTing acceptable, or does NGINX by design delegate the actual population of content to another process?

Nginx is best as a proxy server. You could add functionality to do much more in nginx, including accepting POSTed files, with OpenResty.
Generally, though, send the POST request to another process to handle it, such as a Node.js app.
If I misunderstood, and you just want to get your local files into the server directories that nginx is using, check out rsync.
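To give a rough idea of the split (paths, ports and the upload service below are placeholders, not anything nginx provides itself), the config usually ends up looking something like this:

server {
    listen 80;
    server_name example.com;

    # nginx serves the static files itself
    location / {
        root /var/www/static;
        try_files $uri =404;
    }

    # POSTed uploads are delegated to another process (e.g. a Node.js app),
    # which writes the files into /var/www/static for nginx to serve later
    location /upload {
        proxy_pass http://127.0.0.1:3000;
    }
}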

Related

Scalable way to share cached files across frontend servers

I have multiple backend servers continuously building and refreshing the public parts of an API in order to cache it. The backend servers build whatever is next in the job queue.
At any given time,
backend server 1 will build:
/article/1.json
/article/5.json
and backend server 2 will build:
/article/3.json
/article/9.json
/article/6.json
I need to serve these files from the front-end servers. The cache is stored as files so that they can be served directly by nginx without going through the Rails stack.
The issue is keeping the cache up to date on the front-end servers in a scalable way (adding new servers should be seamless).
I've considered :
NFS / S3 (but too slow)
Memcached (but it can't be served directly from nginx - I might be wrong?)
CouchDB directly serving JSON (I feel this is too big for the job)
Backends writing JSON to Redis, with a job on the front end rewriting the files to the right place (currently my favorite option)
Any experience or great ideas on a better way to achieve this?
You don't say how long it takes to build a single article, but assuming it's not horrifically slow, I think you'd be better off letting the app servers build the pages on the fly and having the front-end servers do the caching. In this scenario you could put some combination of haproxy/varnish/squid/nginx in front of your app servers and let them do the balancing/caching for you.
You could do the same thing I suppose if you continued to build them continuously on the backend.
Your end goal is to have this:
internet -> load balancer -> caching server 1 --> numerous app servers
                         \-> caching server 2 -/
Add more caching servers and app servers as needed. The internet will never know. Depending on what software you pick the load balancer/caching server might be the same, or might not. Really depends on your load and particular needs.
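As a rough sketch of what one of those caching servers could look like in nginx (server addresses, cache zone name and timings are made up for illustration):

# inside the http {} block
proxy_cache_path /var/cache/nginx/articles levels=1:2 keys_zone=articles:10m
                 max_size=1g inactive=60m;

upstream app_servers {
    server 10.0.0.1:8080;
    server 10.0.0.2:8080;
}

server {
    listen 80;

    location /article/ {
        proxy_cache articles;
        proxy_cache_valid 200 10m;     # how long a built article is served from cache
        proxy_pass http://app_servers;
    }
}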
If you don't want to hit the Rails stack, you can catch the request with something like rack-cache before it ever reaches the whole app:
http://rtomayko.github.io/rack-cache/
At least that way, you only have to bootstrap rack.
It also supports memcached as a storage mechanism: http://rtomayko.github.io/rack-cache/storage
You are right, S3 is pretty slow by itself; HTTPS session setup in particular can take up to 5-10 seconds. But S3 is the ideal storage for primary data. We use it a lot, in combination with an S3 Nginx proxy to speed up data delivery and add a caching layer.
The Nginx S3 proxy solution is well tested in production and the caching mechanism works perfectly: every application server goes through the proxy, which fetches the original file from S3 and caches it.
To prevent the dog-pile effect you can use:
proxy_cache_lock for new files (see the nginx docs)
proxy_cache_use_stale updating for files that are being updated (see the nginx docs)
For an S3 Nginx proxy configuration, have a look at https://gist.github.com/mikhailov/9639593
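In abbreviated form (bucket name, cache zone and timings below are placeholders; the gist above has the full, production-tested version), the proxy side looks roughly like this:

# inside the http {} block
proxy_cache_path /var/cache/nginx/s3 levels=1:2 keys_zone=s3_cache:10m
                 max_size=10g inactive=7d;

server {
    listen 8081;

    location / {
        proxy_pass https://my-bucket.s3.amazonaws.com;
        proxy_set_header Host my-bucket.s3.amazonaws.com;
        proxy_cache s3_cache;
        proxy_cache_valid 200 24h;
        proxy_cache_lock on;                           # collapse concurrent misses on new files
        proxy_cache_use_stale updating error timeout;  # serve stale copies while refreshing
    }
}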

Best way to serve files?

I'm a novice web developer with some background in programming (mostly Python).
I'm looking for some basic advice on choosing the right technology.
I need to serve files over the internet (MP3s), but I need to implement some
control over access:
1. Files will be accessible only to authorized users.
2. I need to keep track of how many times a file was downloaded, by whom, etc.
What might be the best technology to implement this? That is, should I
learn Apache, or maybe Django? Or maybe something else?
I'm looking for a 'pointer' in the right direction.
Thanks!
R
If you need to track/control the downloads, that suggests the MP3 URLs need to be routed through a Rails controller. Very doable. At that point you can run your checks, track your stats, and send the file back.
If there are a lot of MP3s, you'd rather not have Rails do the actual sending of the MP3 data, as it's a waste of its time and ties up an instance. Look into X-Sendfile, where Rails can send a response header indicating the file path to send and Apache will intercept it and do the actual sending.
https://tn123.org/mod_xsendfile/
http://rack.rubyforge.org/doc/classes/Rack/Sendfile.html
You could use Django with Lighttpd as the web server. With Lighttpd you can use mod_secdownload, which enables you to generate one-time-only URLs.
More info can be found here: http://redmine.lighttpd.net/projects/1/wiki/Docs_ModSecDownload
You can check permissions in your Django (or any other) app and then redirect the user to this disposable URL if they pass the permission check.

Serving files through controllers with partial download support

I need to serve files through Grails, and only users with permission have access, so I can't serve them with a static link from the container. The system is able to stream binary files to the client without problems, but now (for bandwidth/performance reasons on the client side) I need to implement segmented or partial downloads in the controllers.
Is there a plugin or proven solution to this problem?
Maybe some kind of Tomcat/Apache plugin to restrict access to files with certain rules or temporary tickets, so I can delegate the "resume download" or "segmented download" problem to the container.
I also need to log and save stats on users' downloads.
I need good performance, so I think doing this in the controller is not a good idea.
Sorry for the bad English.
There is a plugin for Apache - https://tn123.org/mod_xsendfile/ - and it doesn't matter what you're running behind Apache in this case. With this plugin you respond with the special X-Sendfile header containing the path of the file to serve, and Apache takes care of the actual file download for the current request.
If you're using Nginx, you have to use the X-Accel-Redirect header instead; see http://wiki.nginx.org/XSendfile
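The nginx side of that is only a few lines; a minimal sketch (paths and the app port are assumptions) would be:

# inside the relevant server {} block
# the app responds with "X-Accel-Redirect: /protected/some/file.bin"
# after checking permissions; nginx then serves the file itself
location /protected/ {
    internal;                       # not reachable directly from the outside
    alias /var/www/private_files/;  # where the files actually live on disk
}

location / {
    proxy_pass http://127.0.0.1:8080;   # the Grails/Rails app
}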

Is it possible to modify the nginx config file and use X-Accel-Redirect on Heroku?

Reading this article on the nginx website, I'm interested in using the X-Accel-Redirect header in the way that Apache or Lighttpd users might use the X-Sendfile header to help with serving large files.
Most tutorials I've found require you to modify the nginx config file.
Can I modify the nginx config file on Heroku and if so, how?
Secondly,
I found this X-Accel-Redirect plugin on GitHub which looks like it removes the need to manually alter the nginx config file - it seems to let you set the redirect location in your controller code - does anyone know if this works on Heroku? I can't test it out until tonight.
NB - I have emailed both Heroku support and goncalossilva to ask them the same questions but I have no idea when they will get back to me. I will post back with whatever it is they tell me though.
Although Heroku seems to be using Nginx for their reverse-proxy component, the thing about a platform-as-a-service stack like this is that no individual tenant has to (nor even gets to) configure or tune distinct elements of the stack for any given application.
Requests in and out could be routed through any number of different elements to and from your Rails app, so it's their platform infrastructure (and not any particular tenant) that manages all of the internal configuration and behavior. You give up fine-grained control for the other conveniences offered by a PaaS such as this.
If you really need what you've described, then I'd suggest you might need to look elsewhere for Rails app hosting. I'd be surprised if their answer were anything but no.

A route to serve static assets (like .jpgs, etc?)

I've worked my way through a number of interesting routing problems - turning a request URL into a hash, etc., but just out of curiosity, is there a way to tell the routing system that you want anything that comes under a certain url subpath to be served literally - without going through a controller?
For instance, if I have /home/me/public_html/rails_proj/images/foo.jpg, and .../rails_proj/images/other/bar.jpg, can I insert a route that says "anything under images should just be served as an object of the default mime type?"
Might be interesting.
If you put the "images" directory into the "public" folder of the Rails app (for example: /public/images/) then you shouldn't have any problems with MIME types unless your web server is configured wrongly.
According to your examples, you want the images dir in the root of the app. I don't think there is a way through Rails to make those images visible, but if you really wanted to, you could use mod_rewrite to make it work. Once again, it would be up to the web server to make sure the images had the correct MIME type.
Things that are served out of the public directory will not go through Rails - they'll just be handled by your web server (probably Apache). The only reason you would need to serve images through the Rails stack is if you wanted some kind of control over who could access them. Just put everything else in public and access it like:
siteurl.whatever/images/*.jpg
I typically use nginx as a frontend and Apache/Passenger as a backend. Nginx proxies all Rails requests to Apache but handles all static content itself. Check out the examples on the English nginx wiki. Here is a small excerpt of an nginx config:
server {
    listen 80;
    server_name www.domain.com;

    location ~* \.(jpg|jpeg|gif|png|ico|css|bmp|js)$ {
        root /path/to/static/assets/dir;
    }

    location / {
        proxy_pass http://127.0.0.1:81;
    }
}
So have Apache listen on port 81 to handle the Rails requests proxied by nginx, and let nginx deliver the static content. Not only is nginx supposedly faster than Apache at delivering static content, but this also offloads your Rails application from serving every image, stylesheet, javascript or other static file.
I think the simplest way to solve this problem is just to use the image_path helper method, which gives you the path for the image you want to display in the view.
For example, if you want to refer to logo.png under /assets/images/logo.png, you can just use image_path('logo.png').
Caveat: if your request URL matches a static resource, WEBrick, mongrel or whatever will happily serve it. You absolutely don't want this in production: if your traffic is high enough, your app will be brought to its knees just because its mongrels are busy serving static content.
So make sure that your web server is properly configured for all kinds of static content as the previous commenters have pointed out.
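In nginx that usually boils down to something like this (the root path and app port are assumptions): serve whatever exists under public/ directly and only pass the rest to the app.

# inside the relevant server {} block
root /var/www/myapp/public;

location / {
    try_files $uri @app;            # static file if it exists, otherwise the app
}

location @app {
    proxy_pass http://127.0.0.1:8080;
}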
