I'm troubleshooting a strange issue a client is having over slow connections. The site has a gallery where images can be uploaded and viewed. The controller that handles the POST first deletes all the old images and then saves the new ones. When users on slow connections (often at job sites) POST images, it can result in all of the images being deleted.
What I believe is happening is that, because of the slow connection, the POSTed images never make it to the server, but the controller method still executes. The old images are deleted, and since the POST of the new images failed, nothing is saved in their place.
Any thoughts on how to correct this without creating separate pages specifically for adding, changing, and deleting images? Is there a way to detect that the method fired and attachments were expected, but they didn't arrive?
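The question doesn't name the framework, but the fix is the same in any of them: verify the expected attachments actually arrived before deleting anything. A minimal Rails-style sketch, where the controller, model, and parameter names are all hypothetical:

```ruby
# Hypothetical Rails-style controller; Gallery, :images, and the routes
# are illustrative since the question doesn't name the framework.
class GalleriesController < ApplicationController
  def update_images
    gallery = Gallery.find(params[:id])
    new_images = Array(params[:images]).reject(&:blank?)

    # Guard: attachments were expected but none arrived (e.g. the upload
    # died on a slow connection), so bail out before deleting anything.
    if new_images.empty?
      return render plain: "No images received, nothing was changed",
                    status: :unprocessable_entity
    end

    # Only now is it safe to replace the old images, and doing it in a
    # transaction means a mid-save failure rolls the deletes back too.
    Gallery.transaction do
      gallery.images.destroy_all
      new_images.each { |img| gallery.images.create!(file: img) }
    end

    redirect_to gallery
  end
end
```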
This is perhaps a simple one but I have been running around in circles for a few hours trying to work out the best way to do this.
Essentially, my app allows a user to create a post entry, which is saved into Core Data and then posted to a web service if the Internet is available. The posting to the web service is done in a background thread, allowing the user to carry on working.
The records are flagged SendToWebService = 1
My issue is that the user can view a list of the entries they made in the app and choose to re-post one to the web service if it hasn't been sent yet. However, this causes duplicate posts, because the previous background thread may still be working on posting the entry (it may be uploading an image or something equally large).
Any suggestions on how to best handle this?
Thanks
I would suggest a single upload-status flag with three values in your Core Data object:
0 => upload failed,
1 => currently uploading,
2 => upload complete.
As soon as the user selects to upload the post, set the flag to 'currently uploading' and replace the upload button with a spinner or something. When it completes, whether it failed or finished, change the upload button to 'done' or 're-upload' depending on the flag.
This seems like the obvious answer; I hope I understood your question correctly.
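The app itself is iOS/Core Data, but the state machine is framework-agnostic; here is a minimal Ruby sketch of the same idea, with all names made up:

```ruby
# Framework-agnostic sketch of the three-value upload flag.
UPLOAD_FAILED   = 0
UPLOADING       = 1
UPLOAD_COMPLETE = 2

class PostEntry
  attr_accessor :upload_state

  def initialize
    @upload_state = UPLOAD_FAILED   # nothing sent yet / needs (re)send
  end

  # Called when the user taps upload; refuses to start a second upload.
  def start_upload
    return false if @upload_state == UPLOADING
    @upload_state = UPLOADING       # UI: swap the button for a spinner
    true
  end

  # Called from the background worker when the request finishes.
  def finish_upload(success)
    @upload_state = success ? UPLOAD_COMPLETE : UPLOAD_FAILED
  end
end
```

The duplicate-post bug disappears because start_upload returning false tells the list UI to ignore the tap while a send is already running.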
How about this: set SendToWebService = 1 for the post that you are currently sending. If it goes through, leave it at 1 or delete the entry (depending on your implementation), but if for some reason it fails to post, set SendToWebService back to 0. That way, while a post is in progress it appears as if it has already been sent.
But if you want to be more transparent about the functionality, create another Boolean called InProgress or something, and set it to 1 when you are sending a request. Don't let the user re-post entries whose InProgress is true, and you can show which ones are in the process of being sent in the UI as well. If a post succeeds, set SendToWebService = 1; if not, set InProgress back to 0.
Hope that helped
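A minimal Ruby sketch of that InProgress guard; the key point is that the check and the set must happen atomically, or the list UI and the background thread can still race (all names here are made up):

```ruby
# Sketch of the InProgress / SendToWebService pair from this answer.
require "thread"

class PostEntry
  attr_reader :in_progress, :sent

  def initialize
    @in_progress = false   # InProgress
    @sent        = false   # SendToWebService
    @lock        = Mutex.new
  end

  # Returns false if a send is already running or already done,
  # so the UI can simply ignore the tap.
  def try_begin_send
    @lock.synchronize do
      return false if @in_progress || @sent
      @in_progress = true
    end
    true
  end

  # Called by the background thread when the request finishes.
  def finish_send(success)
    @lock.synchronize do
      @sent        = success   # SendToWebService = 1 on success
      @in_progress = false     # allows a retry if it failed
    end
  end
end
```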
If the user is viewing the list of entries from the database, then the easiest way would be:
When the post event happens, save the post in the database as 'sent to server' and start the background thread.
When the thread completes its run, check whether the upload failed; if so, mark the entry in the DB as not uploaded. If it succeeded, do nothing.
By saving the state of the upload in the DB, the state persists even if the user changes screens or closes the app.
I have an app using CarrierWave on Heroku. On one page I have two forms: an ajax form for uploading a picture and a normal form for the additional information needed to create the object. Suppose my CarrierWave mount is :picture. Every time the ajax form is submitted, the picture is saved temporarily into the public folder and its path is returned as :picture_cache. The second form then uses that value to know which picture to attach to the new object on the second request. This works fine on a single dyno.
Different dynos don't know about each other's filesystems. Thus if the request to submit the 2nd form doesn't hit the same dyno as the request of the first form, it can't find the image.
Has anyone tackled this problem?
I use a custom model and store all files, including temporary ones, in MongoDB. The uploads are marked as tmp; once the model is 'saved' I simply remove the 'tmp' flag. This way all nodes see all images all the time. It's pretty crazy that the CarrierWave default is to cache in ./tmp, since many multi-node configurations will hit this issue (unless the load balancer implements session affinity).
Here are my model and controller, etc.: https://gist.github.com/3161569
You have to do some custom work in the form:
save every file posted, no matter what
relay the posted file id in a hidden field
on save look for a file and/or previously uploaded id
make the model associations
This approach, although it isn't 'magic', also gives the following nice side effects:
You have one process running background jobs to thumbnail the images, versus spinning up ImageMagick whenever a user hits 'submit' (which is a serious DoS vector, especially on memory-limited hosts like Heroku).
You can migrate images to S3 in the background, hourly, whatever, and the uploads simply get a new URL (in which case the controller needs to issue a permanent redirect when it notices this). This is really nice because you can keep them in the DB for dev, staging, etc., and migrate some or all uploads to S3 whenever you like without changing any upload or view code.
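A rough sketch of the flow from the list above, assuming a hypothetical Upload model backed by a shared store (the linked gist uses MongoDB; every model and field name here is made up):

```ruby
# Ajax endpoint: save every posted file no matter what, marked as tmp,
# and relay its id back so the form can put it in a hidden field.
class UploadsController < ApplicationController
  def create
    upload = Upload.create!(data: params[:file].read, tmp: true)
    render json: { upload_id: upload.id }
  end
end

# Second request: on save, look for a previously uploaded id, promote
# the file out of tmp, and make the model association.
class PostsController < ApplicationController
  def create
    post = Post.new(params[:post])
    if params[:upload_id].present?
      upload = Upload.find(params[:upload_id])
      upload.update!(tmp: false)   # now every node sees a permanent file
      post.upload = upload
    end
    post.save!
    redirect_to post
  end
end
```

Because the file lives in the shared database rather than one dyno's filesystem, it doesn't matter which dyno serves the second request.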
I can't use sessions.
So here's the scenario: I want the user to upload an image, but that image needs to be a particular size. So I allow them to upload an image of any size, store it temporarily on the server (resized so it fits on the web page), display it back to the user, and let them crop it. I then send the crop details back to the server, crop the image, save it, and use it as the user's profile picture.
I tried to do all of this before uploading, but apparently that is a security risk and not allowed.
So how do I temporarily store this file? What if the user never comes back to crop it? I don't want a large image like that sitting on my server. How would I go about removing the file in a stateless application like this?
Files are stored on a CDN.
There are lots of ways to solve this, but perhaps the easiest is to call a small routine every time a file is uploaded that checks for, and deletes, any 'large' files more than xxx minutes old.
Alternatively, schedule a job in the task scheduler to do the same every xxx minutes.
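The question's stack isn't Ruby, but the routine is the same in any language; a sketch, where the directory name and the age threshold are placeholders:

```ruby
# Delete stale temporary uploads; TMP_DIR and MAX_AGE_MIN are
# placeholders for whatever your app actually uses.
require "fileutils"

TMP_DIR     = "tmp/uploads"
MAX_AGE_MIN = 30   # the "xxx minutes" above

def purge_stale_uploads
  Dir.glob(File.join(TMP_DIR, "*")).each do |path|
    next unless File.file?(path)
    age_minutes = (Time.now - File.mtime(path)) / 60
    FileUtils.rm_f(path) if age_minutes > MAX_AGE_MIN
  end
end

# Call this at the top of the upload action, or run it on a schedule.
purge_stale_uploads
```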
You can use TempData, which is similar to Session, but dies after being read.
I have a headache-inducing problem that I can't seem to find an easy solution to.
I have a couple of models, each with an image attachment, that belong to a user. I have made a very nice ajax file upload and image-cropping form, but there is a problem. Everything works fine when I am editing objects that are already in the database, but not when I upload a file while creating a new object. The thing is, for the image to be uploaded and saved, the object already has to exist in the database. I have found two possible solutions to this problem, but neither of them works properly.
The first is to create the object in the database in the new action and redirect to the edit action. The pro is that it is a very simple fix. The con is that such objects show up in the list alongside previously created ones even if the user cancelled or never submitted the form, which is very confusing.
The second possible solution is to lift the attachment fields out of the model into a separate model. On creation I would then only need to create an attachment object. If the user cancelled, the attachment would be left orphaned, but that is probably okay, as the orphans could be cleared out periodically. The problem is that I can't find a way to prevent users from hijacking the orphaned images, or any other image for that matter. Unless I can solve this, I'm stuck.
I'm all out of ideas and would really need some help on this one.
Thanks, godisemo
EDIT:
I was probably unclear. In my form it is possible to upload an image. The image is uploaded to the server instantly with JavaScript, before the form is submitted. The reason is that I want to allow users to crop the image. This is no problem when working with existing objects, but it is when creating new ones, as I tried to explain earlier.
I've never had to have the model already in the DB for Paperclip to work.
One thing you can try, though, is the following. I don't know what your model is called, but let's say a User has an image. Build your new form so that all of the user fields are passed in params[:user], but make the image upload field separate from params[:user], say params[:my_image].
Then, in your controller, validate and save the user; after user.save succeeds, attach the image.
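A minimal sketch of that controller flow, assuming User has Paperclip's has_attached_file :image; the params[:my_image] name follows the answer:

```ruby
# Save the record first, then attach the image once the row exists.
class UsersController < ApplicationController
  def create
    user = User.new(params[:user])      # user fields only, no file here
    if user.save
      # Only attach once the record exists in the database.
      if params[:my_image].present?
        user.image = params[:my_image]  # Paperclip accepts the upload
        user.save
      end
      redirect_to user
    else
      render :new
    end
  end
end
```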
I solved the problem now with a completely different approach. Instead of thinking in terms of databases, objects, and models, I solved it using the file system and temporary files. When the image is uploaded it is processed by Paperclip; I then move the generated images to a folder where I have control over them.
I based my solution on a really great article, which you can find here: http://ryantownsend.co.uk/articles/storing-paperclip-file-uploads-when-validation-fails.html
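A rough sketch of the move step, assuming Paperclip has already written the processed styles to their default paths; the style names, holding directory, and token are made up for illustration (the linked article has the full approach):

```ruby
# Park Paperclip's generated files in a folder I control, keyed by a
# token, so they survive even though the record isn't saved yet.
require "fileutils"

HOLDING_DIR = Rails.root.join("tmp", "held_uploads")

def park_uploaded_styles(attachment, token)
  FileUtils.mkdir_p(HOLDING_DIR)
  [:original, :medium, :thumb].each do |style|    # assumed styles
    src = attachment.path(style)                  # where Paperclip wrote it
    next unless src && File.exist?(src)
    dest = HOLDING_DIR.join("#{token}_#{style}#{File.extname(src)}")
    FileUtils.mv(src, dest)
  end
end
```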
I am using Paperclip to save photos to my users' profiles. Rails is saving the Paperclip attachments multiple times every time a page loads. Any ideas why this is happening?
It looks like the User object is being updated every time the page loads. This can happen if you are recording something like last_login_time for the user. Paperclip does not actually re-save the file every time; it just prints its trace messages during each save. Even though it is annoying, it is mostly harmless.