Paperclip running multiple times on each page - ruby-on-rails

I am using paperclip to save photos to my users' profiles. Rails is saving the paperclip attachments multiple times every time a page loads. Any ideas on why this is happening?

It looks like the User object is being updated every time the page loads. This can happen if you are recording something like last_login_time on the user. Paperclip does not actually save the file every time; it just prints its trace messages during each save. Even though it is annoying, it is mostly harmless.
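If a per-page-view update is the culprit, a callback-free column write avoids re-running Paperclip's save hooks. Here is a minimal plain-Ruby sketch of the difference (in a real Rails app this would be `user.update_column(:last_login_at, Time.current)` instead of `user.save`; the stub class below is illustrative, not Paperclip's actual code):

```ruby
class User
  attr_accessor :last_login_at
  attr_reader :attachment_saves

  def initialize
    @attachment_saves = 0
  end

  # save runs every callback, including Paperclip's after_save hook,
  # which is what prints the trace messages on each request
  def save
    @attachment_saves += 1   # stands in for Paperclip's save_attached_files
    true
  end

  # an update_column-style write: sets the value, skips callbacks entirely
  def update_column(name, value)
    instance_variable_set("@#{name}", value)
    true
  end
end

user = User.new
3.times { user.save }                           # three page loads calling save
user.update_column(:last_login_at, Time.now)    # callback-free alternative
puts user.attachment_saves                      # => 3, not 4
```

The point: recording the login timestamp via a callback-skipping write keeps the attachment machinery out of ordinary page loads.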

Related

Is there a way to only have carrierwave run when certain attributes are added?

I have a song model with genre taggings via acts-as-taggable-on, and saving/updating is very slow, causing timeouts. Without tags I can save a record in about 2 seconds, but with tags it averages around 17 seconds. The complication is that when creating a song I am also uploading a music file and using CarrierWave to process and encode/tag the file itself, so this all has to be done at once.
Right now my idea is to create the song record, move all tag saving to a delayed job, and then have CarrierWave do its thing. So far I'm not seeing anything in the docs that would allow me to do that. My other option is to have CarrierWave process the file twice using recreate_versions!, but I'd like to avoid that if possible.
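For what it's worth, the delayed-job idea could look roughly like this hedged sketch: save the song first so CarrierWave processes the upload, then enqueue the slow tagging write. The `TagApplier` job name is an assumption; `genre_list` follows acts-as-taggable-on's tag-context accessor convention:

```ruby
# A Delayed::Job-compatible job object: anything responding to #perform
# can be enqueued. Song is the model from the question.
class TagApplier < Struct.new(:song_id, :genres)
  def perform
    song = Song.find(song_id)
    song.genre_list = genres   # the slow acts-as-taggable-on write
    song.save!
  end
end

# In the controller, after @song.save succeeds (CarrierWave has already
# processed the file at this point), something like:
#   Delayed::Job.enqueue(TagApplier.new(@song.id, params[:song][:genre_list]))
```

The song is usable immediately; the 15-second tagging cost moves off the request cycle.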

File Attachments timeout and controller method execution

So I'm troubleshooting a weird issue a client has over slow connections. The site has a gallery where images can be uploaded and viewed. The controller that handles the POST first deletes all the old images, then saves the new ones. When users on slow connections (e.g. out on job sites) POST images, it can result in all the images being deleted.
What I believe is happening: because of the slow connection, the POSTed images never make it to the server, but the controller method still executes. The old images are deleted, and since the POST of the images failed, the new images are never saved.
Any thoughts on how to correct this without creating separate pages specifically for adding, changing, and deleting images? Is there a way to detect that the method fired and attachments were expected, but they didn't arrive?
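One hedged way to guard against this: treat "no files arrived" as a no-op instead of wiping the gallery, and only delete the old set once replacements actually exist in the params. A plain-Ruby sketch of the guard (the method and parameter names are assumptions about the app):

```ruby
# Returns the image set the gallery should end up with: if no uploads
# survived the request, keep the existing images untouched.
def replace_images(existing, uploads)
  uploads = Array(uploads).reject { |u| u.nil? || u == "" }
  return existing if uploads.empty?   # nothing arrived: don't delete anything
  uploads                             # replacements present: old set may go
end

old = ["a.jpg", "b.jpg"]
puts replace_images(old, nil).inspect        # => ["a.jpg", "b.jpg"]
puts replace_images(old, ["c.jpg"]).inspect  # => ["c.jpg"]
```

In the real controller the same check would gate the destroy call (and wrapping delete-plus-save in a database transaction helps with the record side, though files on disk are not transactional).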

Conditional/delayed mailing with resque?

I have an application that creates loads of PDF documents which sometimes take a while to generate, so I moved all the PDF creation into a Resque background job. However, some PDFs also need to be sent out by mail, which is now a problem because I don't know how to tell the mailer to wait for the PDFs to be created.
Before I had this:
@contract.create_invoice
ContractMailer.send_invoice(@contract).deliver
Now I have this:
Resque.enqueue(InvoiceCreator, @contract)
ContractMailer.send_invoice(@contract).deliver
So ContractMailer always fails because the PDF is not yet created. Does anyone have an idea how to solve this elegantly?
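One common pattern (a sketch under assumptions, not the only fix): let the Resque job itself send the mail once the PDF exists, so the mailer can never race the job. The internals of `InvoiceCreator` below are assumptions based on the question:

```ruby
# A Resque job class: the queue name lives in @queue and the work in
# self.perform. Resque serializes arguments as JSON, so pass the id,
# not the ActiveRecord object.
class InvoiceCreator
  @queue = :invoices

  def self.perform(contract_id)
    contract = Contract.find(contract_id)
    contract.create_invoice                        # generate the PDF first
    ContractMailer.send_invoice(contract).deliver  # PDF now exists on disk
  end
end

# In the controller, only the enqueue remains:
#   Resque.enqueue(InvoiceCreator, @contract.id)
```

The mail-sending moves off the request entirely, which also keeps the response fast.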
Thanks!

How should I indicate a long request to the user in Rails?

I have a Rails application with a huge database (hundreds of gigabytes) and a lot of different options for what to do with the data. In some cases, like changing data, the work can be done in a background task, which I do with Sidekiq. But in other cases, like viewing data with many rows and columns or running complex SQL queries, fetching the data takes quite a long time.
What I want to do, is show the user that something is happening when he clicks a link. Like the progress bar many browsers have, but more obvious, so even users not used to working with browsers should see that something is happening and loading.
The question is how to do this. I have already tried different options with AJAX and jQuery, but most of the time I could only cover certain actions, and I basically want this for the whole application: every time the user sends a request to load a new page, I want to immediately show him that something is happening.
The closest I came was a JavaScript handler that was always triggered. The problem was that it fired literally every time and forced a page reload, so even toggling an element showed the progress bar and then reloaded the page.
In essence, what I'm looking for:
My application runs on Ruby and Rails 4, and every time a new page loads I want to show the user that his request is being processed, so even if the request takes a couple of seconds, the user won't get nervous, because he knows that something is happening.
I would really appreciate any help for finding a solution, because I can't seem to find any...
You should use a JavaScript animation to show the user that something is happening.
For example: http://ricostacruz.com/nprogress/

How to use carrierwave cache on Heroku multiple dynos?

I have an app with Carrierwave on Heroku. On a page, I have 2 forms: 1 ajax form for uploading a picture and 1 normal form for additional information needed to create the object. Suppose my Carrierwave mount is :picture, every time the ajax form is submitted, the picture is saved temporarily into the public folder and its path is returned as :picture_cache. The second form then uses that to know which picture to be created with the new object on the second request. This works fine for a single dyno.
Different dynos don't know about each other's filesystems. Thus if the request to submit the 2nd form doesn't hit the same dyno as the request of the first form, it can't find the image.
Has anyone tackled this problem?
I use a custom model and store all files, including temporary ones, in MongoDB. The uploads are marked as tmp; once the model is 'saved' I simply remove the 'tmp' flag. This way all nodes see all images all the time. It's pretty crazy that the CarrierWave default is to cache in ./tmp, since many multi-node configurations will hit this issue (unless the load balancer implements session affinity).
here is my model and controller, etc: https://gist.github.com/3161569
you have to do some custom work in the form:
save every file posted, no matter what
relay the posted file id in a hidden field
on save look for a file and/or previously uploaded id
make the model associations
This approach, although it isn't 'magic', also gives the following awesome side effects:
You have one process running jobs in the background to thumbnail the images, instead of spinning up ImageMagick whenever a user hits 'submit' (which is a serious DoS vector, especially on memory-limited hosts like Heroku).
You can migrate images to S3 in the background, hourly, or whenever, and the uploads simply get a new URL (in this case the controller needs to issue a permanent redirect if it notices this). This is really nice because you can keep them in the database for dev, staging, etc., and migrate some or all uploads onto S3 whenever you like without changing any upload or view code.
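The tmp-flag scheme from this answer can be sketched in plain Ruby (the real version in the linked gist stores records in MongoDB; the class, field, and method names below are assumptions):

```ruby
# Every upload goes into one shared store (MongoDB in the answer) marked
# tmp, so any dyno can see it immediately. Saving the owning model just
# flips the flag; no file is ever moved between filesystems.
class Upload
  STORE = {}   # stands in for the shared MongoDB collection

  attr_reader :id, :tmp

  def initialize(id)
    @id, @tmp = id, true   # new uploads start life as temporary
    STORE[id] = self       # visible to all nodes from the moment of upload
  end

  # called when the owning model saves: promote the upload to permanent
  def promote!
    @tmp = false
  end

  # stale tmp uploads (abandoned forms) get swept by a background job
  def self.sweep_tmp!
    STORE.reject! { |_, u| u.tmp }
  end
end

a = Upload.new(1)
b = Upload.new(2)
a.promote!                        # form for upload 1 was submitted and saved
Upload.sweep_tmp!                 # upload 2 was abandoned; sweep it
puts Upload::STORE.keys.inspect   # => [1]
```

Because the "cache" is just a flag on a shared record rather than a file in a dyno-local ./tmp directory, session affinity stops mattering.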
