Running Thin Faye instance separate from nginx+Passenger - ruby-on-rails

I have several Rails apps running under Passenger + nginx 1.2.
I need to add some Comet features to one of them, but I don't want to add any tcp_module and rebuild nginx. I just need a separate Faye+Thin instance running on a subdomain of my app, http://faye.myapp.com:9292 (on the same Ubuntu machine), listening to clients directly.
Is there an elegant solution to this problem?
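One clean approach (a sketch, assuming the faye and thin gems are installed; the filename and mount point are illustrative) is to run Faye as its own Rack application under Thin, with no nginx involvement at all:

```ruby
# faye.ru -- standalone rackup file for the Faye server
require 'faye'

# Faye's Thin adapter handles the WebSocket upgrades for us
Faye::WebSocket.load_adapter('thin')

# Mount the Bayeux endpoint at /faye with a 25-second timeout
run Faye::RackAdapter.new(:mount => '/faye', :timeout => 25)
```

Start it on port 9292 with `rackup faye.ru -s thin -E production -p 9292`, point the faye.myapp.com DNS record at the same machine, and have clients connect to http://faye.myapp.com:9292/faye directly. Since clients talk to Thin on its own port, nginx never sees the Comet traffic and needs no rebuild.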

Related

Can I keep my database on my local network but deploy my rails app on a hosting service?

I have a rails application that is currently hosted on Heroku. It is used on our local network only, and my boss does not want a 3rd party hosting our data. I convinced IT to set me up a virtual windows server to deploy my app on. However, it has been very difficult to set up for production.
Is there any way that I can use a hosting service for my application, but have the database reside on our local network?
Or is there an easier way to deploy a Rails app on a Windows server? I have been looking into using the Windows Subsystem for Linux.
If your app is used on your local network only, why not ditch Heroku and host your Rails app locally as well? What benefit is a scalable cloud hosting provider giving you? Especially since it seems your boss has security concerns about remote hosting of a database. Bringing the entire thing in house may be the best solution.
The simple answer is yes, you can, but why would you? It's simpler to run your application locally than to connect your remote app to a local database.
Your best bet is to use a Linux virtual machine instead of Windows; it is usually too much hassle to get a Rails application working on Windows, especially compiling native gems.
I suggest that you get a CentOS VM and install Nginx with the Passenger gem, using rbenv or RVM.
Digital Ocean has a nice guide that explains this process in detail:
How To Deploy Rails Apps Using Passenger With Nginx on CentOS 6.5
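If you do end up hosting the app remotely with the database in-house, the Rails side is just a database.yml entry pointing at the internal host (a sketch; the hostname, credentials, and the VPN assumption are all placeholders -- the app server needs some network route, e.g. a VPN tunnel, to reach the internal DB):

```yaml
# config/database.yml (production section only)
production:
  adapter: postgresql
  # internal hostname, reachable from the app server via VPN/tunnel
  host: db.internal.example.com
  port: 5432
  database: myapp_production
  username: myapp
  password: <%= ENV['MYAPP_DB_PASSWORD'] %>
```

Bear in mind that every query then crosses the WAN link, which is exactly why the answers above suggest bringing the whole app in-house instead.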

Running multiple Phusion passenger instances for one rails application

Is it possible (or a good idea) to run multiple Phusion Passenger instances on one server (the same Rails app on many instances)?
I want to use Nginx as the frontend web server (load balancer).
Any best practice?
Thanks
Yes, it is possible and quite easy. You just need to run each instance on a different port.
You can refer to this link.
In it, they have run the same application on multiple Ruby versions.
Yes, it is possible and works fine. Just make sure they're assigned to different ports.
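As a sketch of what that looks like (ports and hostname are hypothetical): start several Passenger Standalone instances, e.g. `passenger start -p 3001 -e production -d` and `passenger start -p 3002 -e production -d`, then let Nginx balance across them with an upstream block:

```nginx
upstream myapp_cluster {
  # one entry per Passenger Standalone instance
  server 127.0.0.1:3001;
  server 127.0.0.1:3002;
}

server {
  listen 80;
  server_name myapp.example.com;

  location / {
    proxy_pass http://myapp_cluster;
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
  }
}
```

By default Nginx round-robins between the upstream servers; add `least_conn;` inside the upstream block if you prefer least-connections balancing.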

How to create a multi-app Ruby on Rails shared environment

I am looking to create a shared hosting environment allowing for multiple RoR apps to be running well isolated from one another (and the underlying os), running different versions of RoR as required.
My question is can this be done without having to resort to OpenVZ/Virtualisation?
If so, would the following approach be suitable - what would be required to make apps well isolated from each other and the OS?
Nginx, a single instance for load balancing
Unicorn, multiple instances running behind Nginx to handle requests (capable of running different versions of RoR)
rbenv or RVM, together with Bundler, allow you to isolate the gems of different Rails applications,
so there will be no trouble with that.
Each Rails app will have its own instance of Unicorn (or Puma, Thin, whatever).
Nginx will have a separate domain-name-based virtual host for each Rails app, and will forward requests to the upstream (Unicorn/Puma).
Each Rails app should have a separate database on the DB server too.
So I don't see any problems with isolating multiple Rails apps.
For additional isolation you can use Docker, so that each app runs in a separate container.
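A sketch of the Nginx side of that layout (paths, socket names, and domains are illustrative) -- one virtual host per app, each proxying to that app's own Unicorn socket:

```nginx
upstream app_one {
  # each app gets its own Unicorn master listening on its own socket
  server unix:/var/www/app_one/shared/unicorn.sock fail_timeout=0;
}

server {
  listen 80;
  server_name app-one.example.com;
  root /var/www/app_one/current/public;

  location / {
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_pass http://app_one;
  }
}
```

Repeat the pair of blocks for each app. Since every app has its own Ruby (via rbenv/RVM), its own bundled gems, and its own Unicorn master, nothing is shared except Nginx itself.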

Running RoR in isolation on Ubuntu VPS

Apologies for these stupid questions (please explain why if you're going to downvote).
I have a site running on a LAMP stack on a Linode Ubuntu VPS and want to learn rails on the remote server without causing disruption to the site currently at mydomain.com.
1) Can I install Rails the normal way (as I would on my own PC), and have it not affect the site that is currently up?
2) If I generate an app skeleton after installation and start the Rails server, how can I navigate to the default view?
1) Your test Rails app and the production PHP app can co-exist (hopefully you know your way around Linux) on the same server without interfering with each other. However, I would not recommend this. It's a bad idea to be trying experimental stuff on production VMs/VPSs. You are better off spinning up a test VPS for Rails, or better still using VirtualBox VMs on your local machine.
2) Rails apps start on port 3000 by default. So on the VPS, you can reach the Rails app root at http://xxx.xxx.xxx.xxx:3000 (replace the x's with your VPS's IP addr.)
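One caveat worth hedging on: depending on the Rails version, the development server may bind only to localhost, in which case it won't be reachable from outside the VPS. Binding explicitly avoids the guesswork:

```shell
# bind to all interfaces so the app answers on the VPS's public IP
# (fine for a quick look, but don't leave a dev server exposed like this)
rails server -b 0.0.0.0 -p 3000
```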

What can I use to host a Rails site on Windows?

Okay, before you guys go nuts -- this is just a small site, temporary setup. Right now I'm having some internal folks remote into the server and use the site through webrick via the dev command: ruby script/server. Not exactly ideal.
I'm just starting Rails dev and I want to know a better way to handle hosting on a Windows Pro box. Again, just temporary so please be gentle :)
As far as I know, mod_rails isn't an alternative.
Mongrel plays very nicely on Windows, though, so you can set up a few Mongrels and have IIS or Apache proxy to them. Or just use Mongrel directly. Before mod_rails, Mongrel was the de facto way to deploy on any platform, so it's a very viable choice.
The one time I was forced to deploy on Windows, however, I installed Ubuntu via virtualbox (could also use VMWare or whatever, of course) and deployed on that. Works like a dream, and I got to work with a sensible OS. Phew. SSH and stuff. Can't live without it. Remote desktop isn't exactly my kind of thing.
Your best bet is to set up a Mongrel cluster. Mongrel is an application server which can serve a Rails application over HTTP. But a single Mongrel instance can only handle one request at a time, so typically people run a cluster of Mongrels, i.e. multiple Mongrel instances. These Mongrel instances do not talk to the Internet directly. Instead, they are put behind a load balancer or a web server, which proxies requests to this cluster of Mongrels. If you use Apache on Windows then you can:
1. Setup and start a cluster of Mongrels, each listening on its own port.
2. Setup a virtual host with some mod_proxy_balancer directives, with which you tell mod_proxy_balancer to proxy all requests to the Mongrel cluster. mod_proxy_balancer will automatically distribute the load between the Mongrels.
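A sketch of that Apache virtual host (ports and the balancer name are illustrative; assumes mod_proxy, mod_proxy_http, and mod_proxy_balancer are loaded):

```apache
<VirtualHost *:80>
  ServerName myapp.example.com

  <Proxy balancer://mongrel_cluster>
    # one BalancerMember per Mongrel from step 1
    BalancerMember http://127.0.0.1:8000
    BalancerMember http://127.0.0.1:8001
    BalancerMember http://127.0.0.1:8002
  </Proxy>

  ProxyPass / balancer://mongrel_cluster/
  ProxyPassReverse / balancer://mongrel_cluster/
</VirtualHost>
```

The Mongrels themselves can be started individually (`mongrel_rails start -p 8000 -d`, and so on) or managed as a group with the mongrel_cluster gem.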
If usage is really low, i.e. likely to be mostly one person at a time, or your response time is really low, then you can get away with a single Mongrel and having your users point their browser at the relevant IP address and port.
For some time before I finally got my apps migrated to our corporate Linux/apache "cloud" (which was anything but straightforward, for mostly internal IT-related reasons) I ran two apps on a workstation, using a separate mongrel (different ports) for each. It worked well enough to be useful for almost a year.
These days (well, about three weeks now) I've traded the immediacy, control - and vulnerability - of local (under my desk) access for the stability of five servers, each with multiple Mongrels, staging areas, and deployment annoyances. Swings and roundabouts.
