How do I upload to paperclip asynchronously using backburner? - ruby-on-rails

I want to upload a bunch of files to my server using Paperclip. Currently it is too slow, and hence I want to hand this off to my background workers. I am already using Backburner for a bunch of tasks. I tried the following, which does not work:
upload = UploadedFile.async.create(params[:file])
The async function works with all other normal jobs but does not work with Paperclip.
I read that I could use delayed_job. However, since I am already using Backburner, which seems to do the same worker allocation, I would ideally like to use that. If that is not possible, is it wise to use both Backburner and delayed_job at the same time? Will there be any conflict in worker allocation when both are called at the same time for different processes on the server?

I'm not experienced with Backburner, but uploading files in the background is quite a strange requirement.
Just imagine: while you are uploading a file, the browser keeps a connection open with the server to transfer the data.
Processing in the background would mean breaking that connection immediately, which does not make sense, because you could not transfer the file data that way.
So the upload itself is a synchronous operation and can't be done in the background.
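What usually can be moved off the request, though, is Paperclip's post-processing (thumbnail generation and the like). Below is a hedged sketch of that pattern with Backburner; UploadedFile, the processing column and regenerate_styles! are illustrative names, not part of Paperclip or Backburner, and the callback behaviour should be verified against the Paperclip version in use.

# Accept the upload synchronously, push only Paperclip's style generation
# to a Backburner worker. Sketch only; names are hypothetical.
class UploadedFile < ActiveRecord::Base
  include Backburner::Performable

  has_attached_file :file, styles: { thumb: "100x100>" }

  # Skip style generation while the record is being created in the web request.
  before_file_post_process :skip_styles_on_create

  # Once the record (and the original file) is saved, enqueue the real work.
  after_commit :enqueue_style_generation, on: :create

  def regenerate_styles!
    file.reprocess!                    # Paperclip regenerates all styles
    update_column(:processing, false)  # assumes a boolean `processing` column
  end

  private

  def skip_styles_on_create
    false if new_record?  # returning false halts Paperclip post-processing
  end

  def enqueue_style_generation
    async.regenerate_styles!  # Backburner::Performable instance helper
  end
end

The browser still waits for the file transfer itself, but the expensive resizing happens in the worker, so the request returns much sooner.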

Related

Long-call asynchronous data delivery for Rails app?

We have a rails app that does some user-driven/filtered data representation over a large dataset. So we're calculating things on the fly and it takes longer than the 15s Unicorn gives us!
What's the best option here? I was thinking of using a pub/sub model (like a Node/Faye setup) to allow the rails app to send data that the browser could then render.
I guess another option is to try to pre-generate the data, but as we have a lot of clients and very few would be looking at the data it seems like we'd be wasting a lot of time on preparing data that would never be used.
You're on the right track with pre-generating the data.
If you're concerned about needless number crunching and want to do it on-demand, you could kick off a background job to process the data, and poll periodically to see if the background generation is done or not.
If you're looking for a library to do this for you:
Alternatively, if you're using ActionCable already, get_schwifty was built for this very purpose (shameless plug, I'm the author).
render_async is another option if you're not using ActionCable; however, I believe it still does the processing in a Unicorn process instead of a background job.
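If you roll the poll-for-completion pattern yourself instead of using one of those libraries, it might look roughly like this. A sketch only: Report, ReportJob and ExpensiveQuery are made-up names, and the job is written against ActiveJob.

class ReportsController < ApplicationController
  # POST /reports -- kick off the background generation and return immediately.
  def create
    report = Report.create!(status: "pending", params_json: filter_params.to_json)
    ReportJob.perform_later(report.id)
    render json: { id: report.id, status: report.status }, status: :accepted
  end

  # GET /reports/:id -- the browser polls this until status becomes "done".
  def show
    report = Report.find(params[:id])
    render json: { status: report.status, data: report.result_json }
  end

  private

  def filter_params
    params.permit(:from, :to, :segment)  # whatever filters drive the dataset
  end
end

class ReportJob < ActiveJob::Base
  queue_as :default

  def perform(report_id)
    report = Report.find(report_id)
    data = ExpensiveQuery.run(report.params_json)  # the slow, on-the-fly crunching
    report.update!(result_json: data.to_json, status: "done")
  end
end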

iOS7 Background Synchronization (with NSURLSessionDataTask?)

Scenario:
As a user I am able to take (an unlimited amount of) photos and videos, which are stored in the app's documents folder. Each of these media files gets a record in a SQLite database with additional information (for example a caption). All of this is possible completely offline.
Back online I get a dialog with a list of all the videos and photos I took and a button which starts an upload process.
Each file is uploaded after the other together with its metadata by making a multipart POST request to the server. The response of the server is stored together with the metadata in the Sqlite database (so there is no fire and forget).
Reliable solutions?
If I am reading and understanding this chart correctly, the simplest solution would be to wrap each of these uploads in a Task. Side effect: after 10 minutes every task would be cancelled, which becomes a problem with a slow connection or very large files (for example a very long video).
The recommended way would be to use NSUrlSession/Background transfer service.
Which leads me to my question:
Is it possible to wrap multipart POSTs in NSURLSessionDataTasks and would this be reliable, even if the task is running longer than 10 minutes or the user is suspending the app?
As I am a Xamarin/C# guy I would really appreciate some sample snippets for a working multipart upload, even if it's in Objective-C ;-).
Almost and... yes.
Background Transfer service works with NSUrlSessionDownloadTasks and NSUrlSessionUploadTasks only. Not NSUrlSessionDataTasks, as described here.
Other than this "basic" limitation, it's safe to use background transfer service with upload tasks.
The 10-minute free pass in the background no longer applies on iOS 7 (basically, it's still there, but works differently); however, with NSURLSession and the background transfer service you do not need it.
I have a blog post here about the background transfer service, based on download tasks.
An important thing to note is that, starting a task basically means that it will actually start sometime and actually finish some other time. This depends on whether the device is on cellular or Wi-Fi and other factors which are (probably) only known to iOS (and Apple).

Rails - uploading big images to Heroku

When I upload an image (stored on Amazon S3) to a Heroku app from a camera, where the photos are, let's say, more than 2.5MB, sometimes the request is not processed within 30 seconds and I see the Application Error warning message on the screen.
This is not very user-friendly behavior; how can I avoid it? I know I can buy an additional dyno, but I am not sure that is a solution. For file uploads I use the Paperclip gem.
How do you handle this situation when users upload images bigger than, say, 3MB?
There are a couple of things you could do (in order from best to worst bet):
If you need to do a lot of post-processing on images (like resizing them), you can have all of that processing done on a worker dyno using Delayed Job. This way you get a response back much faster, but your alternate or resized versions of the image aren't immediately available, only :original. There's a tutorial on it here: http://madeofcode.com/posts/42-paperclip-s3-delayed-job-in-rails
You could use Unicorn or one of its cousins. While it likely won't fix the image upload issue by itself, it allows you to adjust how long it takes for a request to time out (a minimal config sketch follows this list). As an added bonus it should also speed up the rest of your app.
You could try using CarrierWave with CarrierWaveDirect instead of Paperclip. This is kind of a shot in the dark as I've never personally used it, but it's supposedly 'better,' which could mean faster? maybe? It sounds like it works in a similar way as Paperclip with delayed jobs.
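A minimal sketch of the Unicorn timeout tweak mentioned above; the file path and values are illustrative, and note that Heroku's router still enforces its own 30-second limit on top of whatever Unicorn allows.

# config/unicorn.rb -- hypothetical values, adjust for your app
worker_processes Integer(ENV.fetch("WEB_CONCURRENCY", 3))
timeout 60          # seconds a worker may spend on one request before being killed
preload_app true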

Ruby/Rails synchronous job manager

Hi,
I'm going to set up a Rails website where, after some initial user input, some heavy calculations are done (via a C extension to Ruby, using multithreading). As these calculations are going to consume almost all CPU time (and memory too), there should never be more than one calculation running at a time. Also, I can't use (asynchronous) background jobs (like with delayed_job), as Rails has to show the results of that calculation and the site should work without JavaScript.
So I suppose I need a separate process where all Rails instances have to queue their calculation requests and wait for the answer (maybe an error message if the queue is full), a kind of synchronous job manager.
Does anyone know if there is a gem/plugin with such functionality?
(Nanite seemed pretty cool to me, but it seems to be asynchronous only, so the Rails instances would not know when the calculation is finished. Is that correct?)
Another idea is to write my own using Distributed Ruby (DRb), but why reinvent the wheel if it already exists?
Any help would be appreciated!
EDIT:
Because of the tips from zaius, I think I will be able to do this asynchronously, so I'm going to try Resque.
Ruby has mutexes / semaphores.
http://www.ruby-doc.org/core/classes/Mutex.html
You can use a semaphore to make sure only one resource intensive process is happening at the same time.
http://en.wikipedia.org/wiki/Mutex
http://en.wikipedia.org/wiki/Semaphore_(programming)
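For instance, a process-local lock along those lines (assuming, for the sake of the sketch, a single Ruby process and a hypothetical HeavyCalc module wrapping the C extension):

# Only one heavy calculation at a time *within this Ruby process*.
# With several app processes you would need a cross-process lock instead
# (e.g. a database row lock or a file lock).
CALCULATION_LOCK = Mutex.new

def run_heavy_calculation(input)
  CALCULATION_LOCK.synchronize do
    HeavyCalc.run(input)  # hypothetical call into the C extension
  end
end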
However, the idea of blocking a front end process while other tasks finish doesn't seem right to me. If I was doing this, I would use a background worker, and then use a page (or an iframe) with the refresh meta tag to continuously check on the progress.
http://en.wikipedia.org/wiki/Meta_refresh
That way, you can use the same code for both javascript enabled and disabled clients. And your web app threads aren't blocking.
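A rough ERB sketch of that meta-refresh status page (the @calculation variable and finished? predicate are made-up names):

<%# app/views/calculations/status.html.erb -- reloads itself until the worker is done %>
<% if @calculation.finished? %>
  <%= render "results", calculation: @calculation %>
<% else %>
  <meta http-equiv="refresh" content="5">
  <p>Your calculation is still running. This page refreshes every 5 seconds.</p>
<% end %>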
If you have a separate process, then you effectively have a background job... so either you can have one or you can't...
What I have done is have the website write the request params to a database. Then a separate process looks for pending requests in the database - using the daemons gem. It does the work and writes the results back to the database.
The website then polls the database until the results are ready and then displays them.
Although I use javascript to make it do the polling.
If you really can't use JavaScript, then it seems you need to either do the work in the web request thread or make that thread wait for the background thread to finish.
To make the web request thread wait, just run a loop in it, checking the database until the reply is saved back into it. Once it's there, you can complete the request.
HTH, chris
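A sketch of that blocking wait, assuming a hypothetical CalculationRequest model that the daemon process updates when it has finished:

# In the controller: park the web request until the daemon has written a result,
# but give up after a fixed number of polls so the thread can't hang forever.
def show
  calc = CalculationRequest.find(params[:id])

  30.times do
    break if calc.reload.completed?
    sleep 1
  end

  if calc.completed?
    @result = calc.result
  else
    render :still_processing, status: :accepted
  end
end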

Large File Uploads

Do large file uploads block an application's request/response cycle? I have an app that allows users to upload multiple large files (images in particular). These files are stored on a remote host. I cannot use async background jobs to upload these images, as they have to be immediately accessible to the user once the upload finishes. How best should I handle such large uploads? Does it affect concurrency? This is my first time dealing with uploads on a large scale. What should I be wary of, other than the huge bills of course? Any input from developers who have created apps which use large file uploads will be greatly appreciated.
Why can't you use an async upload and just handle the event that signifies that it's done? That's generally how async operations work: you kick them off and store the pointer somewhere, and then either handle the "Complete" event, or periodically iterate through the pointers for uploads you've started and check each one to see if it's complete.
It's an old question, but I was worried about the same problem with large file uploads, thinking that the processes get blocked while the file is being uploaded. It turned out, if I understood it correctly, that nginx (and probably other servers as well) buffers the content of the file while it is being sent, so no Rails processes get blocked; a process is only occupied once the upload is finished and Rails is processing it, for example resizing images.
