Is there a gem to make a Rails app work like an FTP server?

I have an app that stores products that get loaded up to other sites. Typically I get the product data via REST in the form of CSV files. These get stored in TMP, parsed and imported.
However, I have a client who now wants to send me files via FTP, but at the same time doesn't want to use an external FTP service.
So I'm wondering if there's a gem to make my Rails app respond to FTP commands. Something that could present a table the same way that FTP presents files.
Yes, I know I should go back and say "ha haa Haaaa!", but mine is not to reason why, mine is to do or die.
I could likely roll my own, but if there's a gem someone knows of, that would be most helpful. Thanks

There may be (but I doubt it). However, Heroku will only accept incoming requests on port 80, or on port 443 if you're using an SSL endpoint. So with Heroku, even if you found a gem it would be impossible to use FTP, since FTP relies on ports 20/21 by default.
I've done something similar but not using FTP, rather having the client place files on Amazon S3 and then have my Heroku app list contents of a folder on S3.
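If you go the S3 route, listing and pulling the client's uploads from the Rails app is straightforward with the official SDK. A minimal sketch, assuming the aws-sdk-s3 gem and a hypothetical bucket/prefix:

require "aws-sdk-s3"

# Assumes AWS credentials come from ENV or an IAM role, and that the client
# drops CSVs into s3://client-uploads/incoming/ (both names are placeholders).
s3 = Aws::S3::Resource.new(region: "us-east-1")
bucket = s3.bucket("client-uploads")

bucket.objects(prefix: "incoming/").each do |object|
  next unless object.key.end_with?(".csv")
  # Download into tmp so the existing parse/import pipeline can pick it up
  object.get(response_target: Rails.root.join("tmp", File.basename(object.key)).to_s)
end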

Related

How to use FTP via a proxy in Rails?

There is an FTP server that I can connect to on my development machine using FileZilla or the Rails app I'm working on. But as soon as I deploy the app to Heroku, the exact same connection parameters time out. My best guess is that the server blocks IP ranges that include Heroku, or dynamic IPs in general. It is not a configuration problem because the deployed app can connect to other FTP servers without issue.
To get around this problem, I'm trying to use a QuotaGuard static URL as a proxy, the add-on for which I've already provisioned and have an ENV variable for. The problem is that this static URL is in the form http://username:password#subdomain.domain.com:9293.
How can I use this to handle an FTP connection?
Current code (works locally, times out on Heroku):
require "net/ftp"

Net::FTP.open(host, username, password) do |ftp|
  ftp.chdir(some_directory)
  # some logic here about which files to download
end
I've checked the Ruby docs for Net::FTP and Net::HTTP for more information. FTP only seems able to use a SOCKS proxy, but HTTP seems more flexible. Could I use the static URL as a SOCKS proxy by ignoring the http:// prefix? Could I restructure the logic so that I can GET each FTP URL I need via HTTP?
I've also looked into using ProxyChainRB to do this but so far not having any luck since I'm running into the same issue of passing the proxy into an FTP connection.
Are there existing libraries that do this? Is there maybe a simpler solution I'm not seeing here?
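For what it's worth, the "GET via HTTP" idea at the end of the question would look roughly like this with Net::HTTP routed through the static proxy. This is only a sketch: it assumes the files are also reachable over plain HTTP, and the QUOTAGUARDSTATIC_URL variable name and file URL are placeholders.

require "net/http"
require "uri"

proxy = URI(ENV.fetch("QUOTAGUARDSTATIC_URL"))             # assumed config var name
file_uri = URI("http://example.com/exports/products.csv")  # hypothetical HTTP URL

Net::HTTP.start(file_uri.host, file_uri.port,
                proxy.host, proxy.port, proxy.user, proxy.password) do |http|
  response = http.get(file_uri.path)
  File.binwrite(Rails.root.join("tmp", "products.csv").to_s, response.body)
end

It does not make Net::FTP proxy-aware, so it only helps if the same files can be exposed over HTTP.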

Are any HTTP log files saved on my system by default?

I have an application hosted on Amazon EC2 on an Ubuntu machine, written in Ruby (on Rails), deployed with Capistrano and running on Nginx.
Last Friday one module of my application crashed and nobody in the company noticed until this morning. We spent some money on Facebook and Google ads and received a few hundred visits, but nobody created an account due to this bug.
I wonder if this configuration saves the HTTP requests and their bodies somewhere in a log file. We didn't explicitly set this up, so it would only happen if any of these technologies does it by default.
Do you know whether there is such a log or not?
Nope, that wouldn't be anywhere in a usable form (I'm inferring you want to try to create the accounts from request bodies in log files). You'll have the requests themselves in your nginx logs, and the rails logs will contain more info about the request, but as a matter of security, by default, any sensitive information (e.g. passwords) would be scrubbed from them. You may still be able to get some info from them.
To answer your question a little more specifically, the usual place for these logs on your system would be:
/var/log/nginx/
/path/to/your/rails/app/log/production.log
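The scrubbing mentioned above is Rails' parameter filtering, which in a default app is configured roughly like this:

# config/initializers/filter_parameter_logging.rb (generated by default in recent Rails apps)
Rails.application.config.filter_parameters += [:password]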
On a separate note, I would recommend looking into an error reporting service like Honeybadger, Airbrake, Raygun, Appsignal, or others so that you don't have silent failures like this moving forward.

File access control in Rails

I have a web application which allows users to upload files and share them with other people across the internet. Anyone who has access can download the files, but if the uploader doesn't specifically share the file with someone else, that person can't download the files.
Since the user permissions are controlled by Rails, each time someone tries to download a file it is sent to the user from a Rails process. This is a serious bottleneck - Rails is needed for the file upload and permissions, but it shouldn't be in the way, taking up memory, just so others can download files.
I would like to split the application across different servers for the frontend, database and file server. If the user goes to my site, they should be able to download the file directly from something like my-fileserver.domain.com/file/38183 instead of running it through Rails.
What is the best option for this? I would like to control file access at the database level, not the file system - but I don't want rails taking up all of the memory on my system for such a simple process. Any ideas?
Edit:
One thing I may be able to do is load a list of files/permissions from mysql into a node.js app and give access rights to the file server as a true/false response based on what the file server sends in. This still requires the file server to run a web server, however.
Maybe you could generate a random URL for each file and control access from a central system.
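A rough sketch of that random-URL idea, assuming an Attachment model with a token column and a shared_with?(user) permission check (all names here are hypothetical):

class Attachment < ApplicationRecord
  has_secure_token :token   # unguessable token generated on create (Rails 5+)
end

class DownloadsController < ApplicationController
  def show
    attachment = Attachment.find_by!(token: params[:token])
    return head(:forbidden) unless attachment.shared_with?(current_user)

    # With config.action_dispatch.x_sendfile_header set (e.g. "X-Accel-Redirect"
    # for nginx), the front-end server streams the file instead of Rails.
    send_file attachment.path, disposition: "attachment"
  end
end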

Rails and Node in the same app on Heroku?

I'm building a Rails application that deals with file uploads through CarrierWave. Currently, larger file uploads block the server for a significant amount of time. I have seen solutions like the s3-swf-upload-plugin gem that skip the local server and send files straight from the browser to S3, but this would require some modifications for pre-generating unique filenames and synchronizing them with the database. I'm sure it wouldn't be too much trouble, but Heroku's new Cedar stack gave me the idea of offloading these long running requests to a node.js instance running in the same app. I'm not very experienced with these kinds of things, so excuse my wording if it's a bit off.
Would something like this be possible? How would you configure things such that certain requests (ones involving file uploads, in this case) would be handled by a node app bundled in the same heroku repository as the main rails app?
I don't think it's possible to mix Rails and Node in the same app. However, you could get roughly the same functionality by using two separate apps that communicate with each other.
You can use ENV['DATABASE_URL'] to determine your database connection string. Use the Heroku console to set it as an ENV variable for your Node app (e.g. heroku config:add OTHER_DB=your_connection_string); the Node app should then be able to use the same connection string to connect to the same database. You could even access it outside of Heroku if you have a dedicated database, see: http://devcenter.heroku.com/articles/external-database-access
For seamless integration between the two apps, you could have a form rendered by the Rails app post to a URL of the Node app. In addition to the file upload, include in that form via hidden input fields any other variables you need to communicate to the Node app. When the upload to the Node app is done, it could redirect the client back to the Rails app, passing any status or variables as get parameters.
Run the two apps under two subdomains of the same domain and you could even share cookies between them.
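As a sketch, the Rails-rendered form posting straight to the Node app might look like this (the Node app URL and field names are placeholders):

<%# app/views/uploads/new.html.erb -- hypothetical view in the Rails app %>
<%= form_tag "https://uploads-example.herokuapp.com/upload", multipart: true do %>
  <%= hidden_field_tag :user_id, current_user.id %>
  <%= hidden_field_tag :return_to, request.original_url %>
  <%= file_field_tag :file %>
  <%= submit_tag "Upload" %>
<% end %>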
You need two apps. I am doing exactly what's described in this question. I wanted large streaming uploads, and since Rack writes uploads to a temp file before passing them through to the handler, it is not possible to do this with Rails.
Node.js, on the other hand, does this beautifully. So there are two Heroku apps, the Rails web app and the Node.js (Express) web app. The Rails web app uses SWFUpload as the client-side solution. The Rails app and the Node.js app both have a secret key as a Heroku config variable. When it's time for the user to upload, client-side Javascript requests an upload URL from the Rails server. The Rails server forms an upload URL with an Expires parameter and computes a signature using the secret key. The client-side Javascript handler passes this URL along to SWFUpload (upload_url property). The user selects the files to upload, and SWFUpload starts posting them to the upload_url. The Node.js app verifies that the URL is not expired and that the signature is valid. It processes the form data with the formidable library.
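The signing step described here could be done along these lines (the config var names and the 15-minute expiry are assumptions; the Node app recomputes the same HMAC to verify):

require "openssl"

# Both apps share UPLOAD_SECRET as a Heroku config variable.
def signed_upload_url(base_url = ENV["UPLOAD_URL"], secret = ENV["UPLOAD_SECRET"])
  expires = Time.now.to_i + 15 * 60
  signature = OpenSSL::HMAC.hexdigest(OpenSSL::Digest.new("SHA256"), secret, expires.to_s)
  "#{base_url}?expires=#{expires}&signature=#{signature}"
end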
One other detail. Flash requires the Node.js app to serve a crossdomain.xml that permits the cross-site request.
My Node.js app doesn't touch the database; but if it did I would share DATABASE_URL as previously suggested. Note that you can't share a DATABASE_URL outside of Heroku unless you have a dedicated DB. The DATABASE_URLs for shared databases are not reachable from outside Heroku (unlike some other services like RedisToGo).

Ruby on Rails deployment on "thin" server with lots of attachments

A lot of PDFs are stored inside MySQL as a BLOB field, one per PDF file. The average file size is 500KB.
The Rails app streams the :binary data as a file download when a user clicks the download link.
Assuming a maximum of 5 users downloading 5 PDFs concurrently, what kind of deployment setup parameters should I be aware of? e.g. for the case of thin:
thin start --servers 3
Is --servers 3 good enough (or are 5 or more needed) for the above example?
The second question is whether thin is a capable solution.
Thanks!
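For concreteness, the download described in the question would be something like the following controller action (model and column names are assumptions):

class DocumentsController < ApplicationController
  def download
    doc = Document.find(params[:id])
    # pdf_data is the MySQL BLOB column holding the file contents
    send_data doc.pdf_data,
              filename: doc.filename,
              type: "application/pdf",
              disposition: "attachment"
  end
end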
Firstly, I don't think you should be storing files in a database. A better place would be the file system, or alternatively cloud storage like S3. If you were to use an attachment plugin like Paperclip this is very easy to set up.
However, let's assume you want to store your files in your database.
The problem with your current setup is that when you're sending your file, your thin instance blocks while your client downloads the data. This means that if you have 3 thin instances and 3 people downloading PDFs, your site will not respond to any other requests.
Thankfully there is a solution to this problem, which involves the x-sendfile header. The way this works is that your thin instance hands the response off to your web server, for example nginx, which then serves the file directly.
Here's a great post on stackoverflow on how to set this up with nginx.
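On the Rails side this is a one-line setting; the header name depends on the front-end server:

# config/environments/production.rb
config.action_dispatch.x_sendfile_header = "X-Accel-Redirect"  # nginx
# or "X-Sendfile" for Apache with mod_xsendfile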
Which web server are you using?
You can dedicate one or two thin instances to your downloads. In your web server you can then set up different proxy rules for your Rails application and for your download URLs.
