Rails: passing large amounts of data between controller actions

I have a controller with three actions: index, iframe, and result. The user visits the index action via a GET request, which renders a view that includes a form; the form is simply a button that POSTs to result. My result action renders a page that includes a jQuery progress bar and an iframe whose content is the iframe action. The iframe action does some long-running processing and eventually returns the result to the result view. (The whole reason I need the iframe is so that result returns quickly and shows a progress bar, so the user doesn't think the application crashed.)
Previously the form consisted solely of a button that POSTed to result. In this scenario, the iframe action downloads a ~100MB file from a static URL and does some processing on it, then updates the parent page (result) with the result of the processing.
Now I need to provide the option of uploading a file to process instead of always using the static URL to download from. Basically, if a user provides a file, use that file; otherwise, use the static URL. I have modified my form to accept a file upload and this part is working fine. My problem is how to pass this uploaded file, which is ~100MB, from result to iframe. It is far too big to put in the session. The uploaded file does not need to be saved between runs.

In addition to the implementation detail, you should update your question with a bit more of the background of the problem: what kind of files are you uploading? What kind of processing do you do? What is the purpose of the system from an end-user's perspective? It'll help people understand the problem you're trying to solve.
I'm making some assumptions about the purpose of your system here, but here's what I'd do:
I'd upload files and store them in the file system, and also create a new record in a database table indicating the location of the file within the system, and whether the file is "processed" or not. You can use the CarrierWave gem for this: https://github.com/carrierwaveuploader/carrierwave
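A minimal sketch of that setup, assuming a model named UploadedFile with a file column and a processed boolean (all names here are illustrative):

    # app/uploaders/data_file_uploader.rb
    class DataFileUploader < CarrierWave::Uploader::Base
      storage :file  # keep uploads on the local file system

      def store_dir
        "uploads/#{model.class.to_s.underscore}/#{model.id}"
      end
    end

    # app/models/uploaded_file.rb
    class UploadedFile < ActiveRecord::Base
      mount_uploader :file, DataFileUploader
    end

    # In the controller that receives the form POST:
    upload = UploadedFile.create!(file: params[:file], processed: false)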
I'd then use some kind of background job tool to do the actual processing. So, when a file is uploaded, you add it to the queue of stuff to be processed, then return a message to the user saying "the job has been added to the queue for processing and will be finished soon". The ID of the row you saved in the database table can serve as the identifier for which file needs to be processed. Once it's processed, you update the "processed" column in the database to be true. There are several gems to choose from when it comes to background jobs; one popular one is Sidekiq: http://sidekiq.org/
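The worker half might look like this; FileProcessingWorker, process!, and download_static_file are illustrative stand-ins for your own code:

    # app/workers/file_processing_worker.rb
    class FileProcessingWorker
      include Sidekiq::Worker

      def perform(uploaded_file_id)
        record = UploadedFile.find(uploaded_file_id)
        # Use the uploaded file if there is one, otherwise fall back to the static URL.
        path = record.file? ? record.file.path : download_static_file
        process!(path)                   # your long-running processing
        record.update!(processed: true)  # flip the flag the UI checks
      end
    end

    # Enqueued from the controller right after saving the upload:
    FileProcessingWorker.perform_async(upload.id)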
At that point, it's basically up to you in terms of how smooth you want to make the process from your end user's perspective. The simplest case would be:
When you return the message "the job has been added to the queue for processing and will be finished soon," you could also say "It should be complete within a few minutes. Refresh this page to see if your job is complete". On each refresh, if the processing is complete, you'd let them know.
A more sophisticated way would be: once the "job has been added to the queue" page is shown, display a timer counting how long it's been, and use a JavaScript timer to make AJAX requests to the server every couple of seconds, checking whether the job has completed. Once it has, update the page via AJAX to say so.
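The server side of that polling loop can be a tiny status action that the page's timer hits (again using the illustrative model above):

    # config/routes.rb: get 'uploads/:id/status', to: 'uploads#status'
    def status
      upload = UploadedFile.find(params[:id])
      render json: { processed: upload.processed }
    end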

Related

How to display a holding screen whilst ActiveJob retrieves lots of data from an external API

I have an application that makes API requests to Salesforce using Restforce.
Specifically, the application finds a contact object, returns IDs for all related objects, and then pulls the full record for every related object based on its ID.
This takes a long time for two reasons:
There are a lot of requests to an external API; each usually takes a fraction of a second to get a reply, and for some contacts there can be 500+ individual requests.
There is often a large amount of data being pulled back via each request.
All requests currently fall within the Salesforce REST API limits, but I'm getting timeout errors from my development server, as it can take 5+ minutes for some of these requests to process.
Rails 4.2 - How best to handle this?
My question is: how do I best get Rails to handle this?
I can fire the API requests either from the controller (which definitely violates the skinny-controller principle) or from the view (via helper methods, which seems like a dodgy hack).
Ideally I'd like to get it running in a background job, but I'm unsure whether I can include all the authentication and other methods in a job in the same way I can include helper methods.
Even if I could get it to work in a background job, I'm unsure what best practice might be for the user experience. Ideally I'd like to route them to a page telling them to "hang tight, go get a coffee", with a progress bar, and then automatically route them to the final page once the request is complete...
But I'm unsure how to generate a temporary display until the job has completed.
Could anyone recommend any gems or strategies that might help me digest this problem?
You should definitely use a background job for this.
Give a database object to the job, which it will update to signal that it has finished, and perhaps update from time to time to indicate progress.
On the user side, simply tell them that the background job is working, possibly with a progress indicator, and display the result once the database object given to the job tells you it's ready.
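To the asker's specific worry about authentication: a job is just a plain Ruby class, so the Restforce client can be built inside perform the same way it would be anywhere else. A hedged sketch, where SalesforcePullJob, the Pull model, and fetch_related_records are illustrative names:

    # app/jobs/salesforce_pull_job.rb (Rails 4.2 ActiveJob)
    class SalesforcePullJob < ActiveJob::Base
      queue_as :default

      def perform(pull_id, contact_id)
        pull = Pull.find(pull_id)  # the database object used to signal progress
        begin
          client = Restforce.new(
            username:       ENV['SALESFORCE_USERNAME'],
            password:       ENV['SALESFORCE_PASSWORD'],
            security_token: ENV['SALESFORCE_SECURITY_TOKEN'],
            client_id:      ENV['SALESFORCE_CLIENT_ID'],
            client_secret:  ENV['SALESFORCE_CLIENT_SECRET']
          )
          records = fetch_related_records(client, contact_id)  # your existing API logic
          pull.update!(status: 'finished', result: records.to_json)
        rescue => e
          pull.update!(status: 'failed', error_message: e.message)
        end
      end
    end

The "hang tight" page then only needs to poll an action that renders pull.status as JSON, and redirect once it reads "finished".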

Detecting a change in the page, including refresh

I am working in a .tpl file, meaning I am open to JS, HTML, and PHP answers. What I want: whenever a person refreshes the page, experiences a change in the URL, or exits the browser, my site should take an action based on this change of state. Basically, when they leave that specific page of mine in any way, I want to call a function. The reason I want this is that I am saving an editable image on my site, and whenever users leave the page, I want the image they created to be autosaved.
This task splits into client-side and server-side parts. On the client side, you should bind to the relevant browser events and trigger background HTTP requests to service URLs on your website; this is the JS part. On the server side, you should provide the corresponding handling for these requests; this is the PHP part.
Since these service URLs will be called intermittently by various visitors, be sure to keep track of which request came from which client's window. PHP sessions should help you here.
I'd suggest approaching this in stages: first get the saving machinery working by binding everything to explicit buttons on the page (one per event: page close, URL change, etc.), then replace each button with a binding to the corresponding JS event. Keep in mind the differences among browsers.
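The question is PHP-flavored, but since the rest of this thread is Rails-centric, here is the shape of the server half sketched in Ruby; everything here (the controller, the session key, the canvas data URL) is an assumption for illustration:

    # A hypothetical autosave endpoint. The client binds to beforeunload (or,
    # more reliably, autosaves on a timer) and POSTs the image data here.
    require 'base64'

    class ImagesController < ApplicationController
      def autosave
        data = params[:image_data]  # assumed to be a base64 data URL from a canvas
        return head :bad_request if data.blank?
        path = Rails.root.join('tmp', "autosave_#{session[:user_id]}.png")
        File.binwrite(path, Base64.decode64(data.split(',').last))
        head :ok
      end
    end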

Loading page while Rails app starts?

I have a Rails application on a shared server, with a decently sized (and still growing) database behind it. The application takes a long time to start and load the homepage: about 20-30 seconds for me, although some people report waiting up to several minutes.
Is there a way to flash a notice that informs people that the database may take several minutes to load while they are waiting?
It's hard to say based on your question, since we don't know exactly what your home page is showing or how it's displaying it, but assuming you are referring to an AJAX call (based on the tag) that retrieves something from the database to display on your homepage, there are a few things you can try:
Paginate the items. Is whatever you're loading a long list of items? If so, only retrieve a few at once, and let the user decide if they want to see more (see the sketch at the end of this answer).
Load the rest of the page (header, footer, navigation bar, etc.) first, then place a loading-spinner GIF in the area where the content will be loaded. If you use a JavaScript library like jQuery this is pretty trivial, and there are a ton of tutorials out there for it; http://ajaxload.info/ is a good site for free loading indicators. Make the AJAX call, use your JavaScript library to show the loading image, and then, in the success callback for your AJAX call, hide the spinner and show the content.
Load one item at a time. Make a separate ajax call for each item you're going to load, so that the user sees them coming in. This will probably end up taking longer overall (you're hitting the database more often), but the visual may be a nice psychological hack.
Look at how your database queries are set up. Are you getting everything you need in one find? That's the best way to do it, as every time you have to make another trip to the database you're increasing the wait time.
Other than that, the best thing you can do is get better hardware if possible; maybe look into a VPS like linode.com.
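For points 1 and 2, the server side might look like the following; ItemsController, the partial name, and the page size are all illustrative, and plain limit/offset is used so no extra gems are needed:

    class ItemsController < ApplicationController
      PER_PAGE = 20

      # Answers the page's AJAX call with just the list fragment,
      # so the rest of the page can render immediately with a spinner.
      def index
        page   = params.fetch(:page, 1).to_i
        @items = Item.order(:created_at).limit(PER_PAGE).offset((page - 1) * PER_PAGE)
        render partial: 'items/list', locals: { items: @items }
      end
    end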

How to store user preferences? The cookie keeps growing

My application (ASP.NET MVC) has a lot of user-interface interaction (jQuery/JS): configuring various search charts, moving gadgets around the screen, and more. Naturally I want to keep all this data for each user, so it's available from any page in the domain and the user gets his preferences back.
Right now I keep all the data in a cookie, because it didn't seem sensible to make an asynchronous call to the server every time the user changes something, and that happens a lot. When the user logs out of the application, I save the cookie to the database.
The question is how to save the settings back to the DB, from the client to the server, because there are a lot of interactions that I want to record. Example scenarios: closing a widget, moving a widget, resizing menus, reordering columns. I want to record all of those actions, but firing an AJAX save routine for each one would be too cumbersome. Maybe I have no choice, or maybe I should run an asynchronous save of everything at some fixed interval.
The problem is that the cookie becomes very large. The thought of this huge cookie being attached to every server request makes me feel that my approach is wrong.
Another problem: cookies have a size limit. It varies by browser, but I have definitely come close to the limit; my cookie easily reaches 4 KB.
Is there another solution?
Without knowing your code: have you considered storing the user's preferences in your database? A UserPreference table with columns for the various settings is one possibility.
You could update it via AJAX/JSON if you had a 'Save Preferences' option, or just update it on postback.
EDIT 1: After thinking about it, I think having an explicit 'save preferences' button would be beneficial and practical.
Somewhere on your page, where the user edits the things that generate the cookie, put a Save button and hook up a jQuery click handler. On click, build a CSV string (or use another method of encoding the preferences) for posting back to the server, then use $.post to send it to an action method in a controller.
Once there, store it in the database somehow (exactly how is up to you), then return a JSON object with a success attribute to denote whether storing the preferences was successful.
When the page is loading, get the preferences out of the database and perform your manipulation.
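The question is ASP.NET MVC, but the save endpoint has the same shape everywhere; sketched here in this thread's Ruby/Rails terms, with PreferencesController, current_user, and a has_one user_preference association all assumed for illustration:

    class PreferencesController < ApplicationController
      # Receives the jQuery $.post from the Save button.
      def update
        prefs = current_user.user_preference || current_user.build_user_preference
        prefs.data = params[:preferences]  # the CSV/JSON blob built on the client
        if prefs.save
          render json: { success: true }
        else
          render json: { success: false }, status: :unprocessable_entity
        end
      end
    end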
Another solution would be to store the user preferences in the session and write some server-side logic (like an action filter) that writes those preferences as a JSON-encoded string on each page (in a script tag towards the end of the markup), making them available to client scripts.

How to Get Results From a Background Process

I am designing a Ruby on Rails application that requests XML feeds, reads them in, and parses them into objects to be used in views. Since the request for an XML feed and subsequent receipt of it can take several seconds from some sources, I need a way to offload these tasks from my front-line application tier: I do not want my application servers to take more than a few hundred milliseconds to process a request. Currently the application-serving processes sit and wait for the XML feed data to be returned so they can parse it and finish returning the user's request. I am aware of DelayedJob; however, given that the result of this action is to be returned to the user in real time, I am unsure how to offload it to a background task and receive the result.
If I offload this task to a background task how does the result get returned to the user loading the page?
One common model for this sort of thing is to use your preferred background job library (you mention DelayedJob, which seems to be a popular one) to offload the task from the request/response cycle, and then set up AJAX polling on the client to update the page with the results once they become available.
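In outline, with DelayedJob (FeedResult, FeedFetcher, and the column names are illustrative):

    class FeedsController < ApplicationController
      def show
        @result = FeedResult.create!(status: 'pending', url: params[:url])
        # `.delay` comes from the delayed_job gem; the class method runs in the background.
        FeedFetcher.delay.fetch_and_parse(@result.id)
        # The view renders a spinner and polls the status action below.
      end

      def status
        result = FeedResult.find(params[:id])
        render json: { status: result.status, html: result.parsed_html }
      end
    end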
You can have your main returned page fire an AJAX request at a second tier of servers that handle the XML retrieval, and return HTML for the section of the page that will contain that information. That way you aren't running any asynchronous jobs (from the server's point of view) and the retrieval won't start until the AJAX request comes in, which will reduce the bandwidth you waste on bots.
This is a standard use of AJAX, so I'm not sure whether I'm missing something in your problem that makes it inappropriate for you.
The most common approach is to use AJAX and DelayedJob here, but that is only a usability improvement: instead of waiting 5 seconds for the page to load, the user gets an empty or half-empty page with a spinner for 5 seconds. The only way (in my opinion) to really improve the user experience is to fetch and process those XML feeds periodically and show the user the cached result.
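One simple approximation of that cached approach is to wrap the fetch in Rails.cache, so only the first request after expiry pays the cost (the cache key and TTL are illustrative, and parse_feed stands in for your existing parser):

    require 'net/http'

    def feed_objects(url)
      Rails.cache.fetch("xml_feed/#{url}", expires_in: 10.minutes) do
        parse_feed(Net::HTTP.get(URI(url)))
      end
    end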
If you are open to Perl code running on your server, I'd lift a piece of LiveJournal infrastructure: Gearman and TheSchwartz
Sounds like you want Gearman - and it has Ruby client bindings.
(see http://www.livejournal.com/doc/server/lj.install.workers_setup_install.html)
