Rails and Node in the same app on Heroku?

I'm building a Rails application that handles file uploads through CarrierWave. Currently, larger file uploads block the server for a significant amount of time. I have seen solutions like the s3-swf-upload-plugin gem that skip the local server and send files straight from the browser to S3, but this would require some modifications for pre-generating unique filenames and synchronizing them with the database. I'm sure it wouldn't be too much trouble, but Heroku's new Cedar stack gave me the idea of offloading these long-running requests to a Node.js instance running in the same app. I'm not very experienced with these kinds of things, so excuse my wording if it's a bit off.
Would something like this be possible? How would you configure things such that certain requests (ones involving file uploads, in this case) are handled by a Node app bundled in the same Heroku repository as the main Rails app?

I don't think it's possible to mix Rails and Node in the same app. However, you could get roughly the same functionality by using two separate apps that communicate with each other.
You can use ENV['DATABASE_URL'] to determine your Rails app's database connection string. Use the Heroku console to set it as a config variable for your Node app (e.g. heroku config:add OTHER_DB=your_connection_string); the Node app should then be able to use that same connection string to connect to the same database. You could even access it from outside Heroku if you have a dedicated database; see: http://devcenter.heroku.com/articles/external-database-access
For seamless integration between the two apps, you could have a form rendered by the Rails app post to a URL of the Node app. In addition to the file upload, include in that form, via hidden input fields, any other variables you need to communicate to the Node app. When the upload to the Node app is done, it could redirect the client back to the Rails app, passing any status or variables as GET parameters.
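For example, the Rails view might look something like this, where the Node app's URL and the field names are placeholders of my own, not anything prescribed:

<%# Posts the file straight to the Node app, carrying extra state in hidden fields. %>
<%= form_tag 'https://your-node-app.herokuapp.com/upload', :multipart => true do %>
  <%= hidden_field_tag :user_id, current_user.id %>
  <%= hidden_field_tag :return_to, root_url %>
  <%= file_field_tag :file %>
  <%= submit_tag 'Upload' %>
<% end %>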
Run the two apps under two subdomains of the same domain and you could even share cookies between them.
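On the Rails side, the cookie sharing can be a one-line session store tweak. A minimal sketch, assuming both apps run under subdomains of example.com:

# config/initializers/session_store.rb
YourApp::Application.config.session_store :cookie_store,
  :key    => '_yourapp_session',
  :domain => '.example.com'   # cookie is visible to every subdomain, including the Node app's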

You need two apps. I am doing exactly what's described in this question. I wanted large streaming uploads, and since Rack writes uploads to a temp file before passing them through to the handler, it is not possible to do this with Rails.
Node.js, on the other hand, does this beautifully. So there are two Heroku apps, the Rails web app and the Node.js (Express) web app. The Rails web app uses SWFUpload as the client-side solution. The Rails app and the Node.js app both have a secret key as a Heroku config variable. When it's time for the user to upload, client-side Javascript requests an upload URL from the Rails server. The Rails server forms an upload URL with an Expires parameter and computes a signature using the secret key. The client-side Javascript handler passes this URL along to SWFUpload (upload_url property). The user selects the files to upload, and SWFUpload starts posting them to the upload_url. The Node.js app verifies that the URL is not expired and that the signature is valid. It processes the form data with the formidable library.
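Here's a rough sketch of the Rails side of that scheme. The host, parameter names, and the HMAC-SHA256 choice are my assumptions, not necessarily what was actually used:

require 'openssl'

# Builds an expiring upload URL that the Node app can verify by
# recomputing the HMAC over the same payload with the shared secret.
def signed_upload_url(user_id)
  expires = 10.minutes.from_now.to_i                  # the Expires parameter
  payload = "user_id=#{user_id}&expires=#{expires}"
  secret  = ENV['UPLOAD_SECRET']                      # same Heroku config var on both apps
  sig     = OpenSSL::HMAC.hexdigest(OpenSSL::Digest.new('sha256'), secret, payload)
  "https://your-node-app.herokuapp.com/upload?#{payload}&signature=#{sig}"
end

The Node side recomputes the signature over the same payload and rejects the request if it differs or if expires is in the past.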
One other detail. Flash requires the Node.js app to serve a crossdomain.xml that permits the cross-site request.
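A maximally permissive policy file looks like the following; in production you would probably restrict domain to your Rails app's host:

<?xml version="1.0"?>
<!DOCTYPE cross-domain-policy SYSTEM "http://www.adobe.com/xml/dtds/cross-domain-policy.dtd">
<cross-domain-policy>
  <allow-access-from domain="*" />
</cross-domain-policy>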
My Node.js app doesn't touch the database; but if it did I would share DATABASE_URL as previously suggested. Note that you can't share a DATABASE_URL outside of Heroku unless you have a dedicated DB. The DATABASE_URLs for shared databases are not reachable from outside Heroku (unlike some other services like RedisToGo).

Related

Migrating to Cedar on Heroku and losing URL from heroku.com to herokuapp.com

I'm in the process of making several major improvements to my main production app on Heroku.
This includes: using PostgreSQL in development, upgrading to the latest Rails, moving to a dedicated database with Crane, and using thin as the web server. The "last" thing I wanted to do as a logical step was to upgrade my app to the Cedar stack instead of Bamboo. I've followed most of the instructions and have an OK "clone" app.
I'd like to move forward and use this new Cedar app instead of the Bamboo one.
The problem is that this app's main use is as a backend serving API requests to an iOS app. These requests are in the format: xxx.heroku.com/...
It was probably a bad idea to use this URL in the first place, but that's how it is, and it can't be changed for all our current iOS users.
I can find a way to rename my Cedar app xxx. The problem is that it will then live at xxx.herokuapp.com. I know there is an automatic redirect on Heroku, but it seems to apply only to GET requests, so none of my API requests will return the XML responses the iOS app needs.
Any suggestions? I thought I would be able to keep using xxx.heroku.com going forward, and I'm a bit stuck now.
The simple thing to do is replace your Bamboo app with a tiny app that redirects requests to your new URL.
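For example, a bare Rack app could do this. A 307 is used here because, unlike 301/302, it asks clients to repeat the same method and body, though whether a given iOS HTTP library honors that for POST is worth verifying:

# config.ru of the replacement Bamboo app
run lambda { |env|
  req = Rack::Request.new(env)
  [307, { 'Location'     => "https://xxx.herokuapp.com#{req.fullpath}",
          'Content-Type' => 'text/plain' }, []]
}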
I would suggest never using the heroku domains as production domains - always put your own domain in front of them.

How to post images directly to s3 on a heroku app from a json request?

I have a rails app hosted on heroku and a mobile app made with rhodes.
I'd like to send images from the mobile app to my Rails app using an HTTP POST request. Since Heroku doesn't let you store files on its filesystem, I'm using Amazon S3.
I can't send the file from heroku to s3 because it takes more than 30 seconds and causes a timeout. I've seen plenty of examples of uploading a file direct to s3 when the user has a form, but this obviously won't work in this case.
I tried using the suggestion here:
rails 3, heroku, aws-s3, simply trying to upload a file to S3 that is POSTed (http/multipart) to our app
but I still get a 503 request timeout.
I don't want to put my Amazon S3 keys in the mobile app.
Right now, I feel like my only option is to host my app on EC2 which I would rather not do as I like the simplicity of Heroku.
Also, it seems strange that these uploads would take so long regardless. I'm only posting images from a mobile phone camera, so they're not huge files.
I was getting the same error on a project at my job. Some people say the only way to solve this is to upload files directly to the S3 bucket. That is difficult in our case, because we are using the Paperclip gem for Rails and generate several resized versions of each image.
Other people say that "The Heroku timeout is a set in stone thing that you need to work around. Direct upload to S3 is the only option, with some sort of post-upload processing required", so I recommend the following.
This may not be a complete solution, but it was very useful for me in a Rails app:
Worker Dynos, Background Jobs and Queueing
Perhaps you should move this heavy lifting into a background job which can run asynchronously from your web request.
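As a sketch of what that could look like with Delayed::Job (the controller, job, and bucket names are made up for illustration; any queue library would do):

# app/controllers/uploads_controller.rb
class UploadsController < ApplicationController
  def create
    # Read the bytes up front: web and worker dynos don't share a
    # filesystem, so the data has to travel through the queue
    # (acceptable for phone-camera-sized photos).
    data = params[:file].read
    Delayed::Job.enqueue S3UploadJob.new(params[:file].original_filename, data)
    head :accepted   # respond immediately, well inside Heroku's 30 s window
  end
end

# A Struct-based Delayed::Job payload; perform runs on the worker dyno.
S3UploadJob = Struct.new(:key, :data) do
  def perform
    AWS::S3::S3Object.store(key, data, 'your-bucket')   # aws-s3 gem
  end
end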
Regards!
So I finally figured out how to do this.
After lots of back and forth with AWS reps and Rackspace Cloud Files reps, and pulling my hair out, I realized it would be a lot less work to just get another Rails server that could write to the filesystem.
So, I started another rails app on openshift. It's just as easy as Heroku to get started (in fact, I might consider moving my rails app there, but it's too new for my taste right now and doesn't have the community around it that Heroku does).
Then, I just had to have communications between my two rails apps.
I know it's not the best/scalable/elegant fix, but it got the job done, and that's what matters in the end!

Ruby on Rails deployment on a "thin" server with lots of attachments

A lot of PDFs are stored in MySQL as a BLOB field per PDF file, averaging 500K each.
The Rails app streams the :binary data as a file download when a user clicks the download link.
Assuming a maximum of 5 users downloading 5 PDFs concurrently, what deployment parameters should I be aware of? e.g. for the case of thin:
thin start --servers 3
Is --servers 3 good enough (or are 5 or more needed) for the above example?
The second question is whether thin is a capable solution at all.
Thanks!
Firstly, I don't think you should be storing files in a database. A better place would be the file system, or alternatively cloud storage like S3. If you use an attachment plugin like Paperclip, this is very easy to set up.
However, let's assume you want to store your files in your database.
The problem with your current setup is that while a file is being sent, your thin instance is blocked until the client finishes downloading the data. This means that if you have 3 thin instances and 3 people downloading PDFs, your site will not respond to any other requests.
Thankfully there is a solution to this problem, which involves the X-Sendfile header. The way this works is that your thin instance responds with a header pointing at the file, and your web server, for example nginx, then serves the file directly.
Here's a great post on Stack Overflow on how to set this up with nginx.
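In Rails 3 terms, a sketch (the model, temp path, and controller are illustrative, and nginx additionally needs a matching internal location, which the linked post covers):

# config/environments/production.rb
config.action_dispatch.x_sendfile_header = 'X-Accel-Redirect'   # nginx; Apache uses 'X-Sendfile'

# app/controllers/pdfs_controller.rb
class PdfsController < ApplicationController
  def show
    pdf  = Pdf.find(params[:id])
    path = Rails.root.join('tmp', 'pdfs', "#{pdf.id}.pdf")
    unless File.exist?(path)
      FileUtils.mkdir_p(File.dirname(path))
      File.open(path, 'wb') { |f| f.write(pdf.data) }   # dump the BLOB once
    end
    # With x_sendfile_header set, send_file emits only a header and the
    # front-end server streams the file, freeing the thin worker at once.
    send_file path, :type => 'application/pdf', :disposition => 'attachment'
  end
end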
Which web server are you using?
You can dedicate one or two thin instances to downloads. In your web server you can then set up separate proxy rules, sending your download URLs to those instances and the rest of your Rails application to the others.

mod_xsendfile alternatives for a shared hosting service without it

I'm trying to log download statistics for .pdfs and .zips (5-25MB) in a rails app that I'm currently developing and I just hit a brick wall; I found out our shared hosting provider doesn't support mod_xsendfile. The sources I've read state that without this, multiple downloads could potentially cause a DoS issue—something I'm definitely trying to avoid. I'm wondering if there are any alternatives to this method of serving files through rails?
Well, how sensitive are the files you're storing?
If you hosted these files somewhere under your app's /public directory, you could just do a meta tag or javascript redirect to the public-facing URL of these files after your users hit some sort of controller action that will update your download statistics.
In this case, your users would probably need to get one of those "Your download should commence in a few moments" pages before the browser would start the file download.
Under this scenario, your Rails application won't be streaming the file out; your web server will, which gives you the same effect as x-sendfile. On the other hand, this won't work very well if you need to control access to those downloadable files.
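A small sketch of that flow, with the model and paths invented for illustration:

# app/controllers/downloads_controller.rb
class DownloadsController < ApplicationController
  def show
    @download = Download.find(params[:id])
    @download.increment!(:hits)   # log the statistic
    # show.html.erb then meta-refreshes to the file that the web server
    # serves directly out of public/files:
    #   <meta http-equiv="refresh"
    #         content="3; url=/files/<%= @download.filename %>">
  end
end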

can Amazon be used to offload server of static files for a Ruby on Rails app, but still support the app's authentication & authorization?

Can one of the Amazon services (their S3 data service, or otherwise) be used to offload server of static files for a Ruby on Rails app, but still support the app's authentication & authorization?
That is, once the user's browser has downloaded the initial HTML for a page of the Ruby on Rails application and goes back for static content (e.g. an image or CSS file), that request would be:
(a) routed directly to the Amazon service (no RoR cycles used to serve it, or bandwidth), BUT
(b) the browser request for this item (e.g. an image) would still have to go through an authentication/authorization layer based on the user model in the Ruby on Rails application - in other words to ensure not just anyone could get the image...
thanks
The answer is a yes, with a but. You can use a feature of S3 that lets you create links to private S3 objects with a short time to live (the default is 5 minutes). This works for any S3 object that is uploaded as private, and it means the browser only has that many seconds to request the file from S3. Example code from the docs for the aws-s3 gem:
S3Object.url_for('beluga_baby.jpg', 'marcel_molina')
You can also specify an expires_in or expires option per file. The downside is that you would need to create a helper for your stylesheet, image, and JS links to generate the proper S3 URLs.
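Such a helper might look like this, reusing url_for from above (the bucket name and expiry are assumptions):

# app/helpers/s3_links_helper.rb
module S3LinksHelper
  # Expiring URL for a private S3 object, e.g. <%= image_tag s3_url('chart.png') %>
  def s3_url(key)
    AWS::S3::S3Object.url_for(key, 'your-bucket', :expires_in => 5.minutes.to_i)
  end
end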
I would recommend that you set up a domain name for your S3 bucket, like "examples3.amazonaws.com", and put all your standard image files and CSS there as public. Then set that as the asset host in your Rails config, and only use the secure links for static files that really need them.
