Let's say my front end makes a ton of AJAX calls to the Ruby backend for JSON objects.
This is open-ended, but in your experience, which is the better option? I'm concerned with (1) performance and (2) style.
Option 1: Make one AJAX request to backend that returns a lengthy JSON string
Option 2: Break down the request into several AJAX requests that each returns a shorter JSON string.
Or is there an Option 3? I'll take other alternatives into consideration.
Thanks!
I think if you are making frequent AJAX requests, you could try WebSockets, as they are widely used for this kind of application these days. Here is a link that makes a very good comparison between the different technologies used for such requirements: In what situations would AJAX long/short polling be preferred over HTML5 WebSockets?
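For instance, with Rails 5+ the built-in ActionCable layer can push JSON to subscribed clients instead of having the front end poll with AJAX. A minimal sketch, assuming ActionCable is set up; the channel name "updates" and the payload are made up for illustration:

# app/channels/updates_channel.rb
class UpdatesChannel < ApplicationCable::Channel
  def subscribed
    # every client subscribed to this channel receives the broadcasts below
    stream_from "updates"
  end
end

# anywhere on the server, push fresh JSON to all subscribers:
ActionCable.server.broadcast("updates", { status: "ok", updated_at: Time.now })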
I need to make HTTP requests to the appropriate URLs every 30 minutes. Currently my RoR server handles a bunch of simple routes, and I don't want my sending service to block the main routing; in other words, it should make requests from the queue only when there is no other job. I have no idea how Ruby handles threads (I came from the JS world) and don't want to dig too deep; I just want to know whether I should care about it or not. Thanks.
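For reference, a minimal sketch of a periodic sender like this, assuming the rufus-scheduler gem (the URL list and the job body are placeholders). rufus-scheduler runs each job in its own thread, so the 30-minute requests do not block normal Rails request handling:

require 'rufus-scheduler'
require 'net/http'

scheduler = Rufus::Scheduler.new

# runs every 30 minutes in a background thread
scheduler.every '30m' do
  ['https://example.com/ping'].each do |url|   # placeholder URLs
    Net::HTTP.get_response(URI(url))
  end
end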
I want to test a set of ruby-on-rails applications. Specifically, I want to trigger all possible GET/POST requests available. I am considering using some web crawler-like tool, which could (recursively) send requests to my web server, get responses, and parse the response HTML file to get all possible "href tags", "form submission buttons", etc.
Essentially I want to see the performance of these web applications and get some logs of things like what are the request routes, parameters, database accesses, queries, transactions, etc.
Sending GET requests is relatively easy to handle: I would simply need to parse the HTML response and extract the href attributes of all anchors. However, I don't know how to handle the POST requests; they would require me to fill in all the parameter fields included in the forms. I am wondering whether there are tools that do this kind of work, or tools whose code I could modify (without too much effort) to achieve my functionality.
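A minimal sketch of the extraction step described above, assuming Nokogiri (the URL is a placeholder); it collects the anchor targets plus, for each form, the action, method, and input names a crawler would need to fill in:

require 'net/http'
require 'nokogiri'

html = Net::HTTP.get(URI('http://localhost:3000/'))   # placeholder URL
doc  = Nokogiri::HTML(html)

# links to follow with GET requests
links = doc.css('a[href]').map { |a| a['href'] }

# forms to submit (mostly POST): collect action, method, and the fields to fill in
forms = doc.css('form').map do |form|
  {
    action: form['action'],
    method: (form['method'] || 'get').downcase,
    fields: form.css('input, textarea, select').map { |i| i['name'] }.compact
  }
end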
Thanks a lot.
I'm wondering what the best way is to go about developing a Rails application with the following features:
All of the data comes from a SOAP request to a 3rd party
A background task will make this soap request every ~10s
The background task will parse the response and then update an ActiveRecord model accordingly
The data isn't written to a database at all; if the app fails, the data will come from the SOAP request again when we start it back up
Users will make a request to the app which will simply show data in the model (i.e. from the soap request).
The idea is to avoid making the SOAP request for every single user as the data won't change that frequently. Not using a database avoids reading and writing of data that only ever comes from the request anyway.
I imagine that all of this can be accomplished quite simply with a few gems, but I've had a bit of trouble sorting through what meets my requirements and what doesn't.
Thanks
I'm not sure what benefit you're getting from using ActiveRecord in this case.
Perhaps consider some other type of persistence for the SOAP calls?
If the results from the web service are really not changing that often, I would recommend the Rails caching mechanism. Anywhere in your Rails app, you can do:
Rails.cache.fetch("a_unique_cache_key") do
  # ... do your SOAP request here and return the result
end
This will do the work within the block just once and fetch its result from the Rails cache store in the future.
The cache store can be of various types (one of which is the memcache store). I usually go with the file store for medium-traffic sites, but you may choose another:
http://guides.rubyonrails.org/caching_with_rails.html
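To get the roughly 10-second refresh described in the question, the cache entry can simply be given an expiry, so at most one request re-fetches the data every 10 seconds while everyone else reads the cached copy. A minimal sketch, assuming the Savon gem for the SOAP call; the WSDL URL, operation name, and cache key are placeholders:

require 'savon'

def latest_data
  Rails.cache.fetch("soap_latest_data", expires_in: 10.seconds) do
    # re-fetched at most once every 10 seconds; all other requests read the cached copy
    client = Savon.client(wsdl: "https://example.com/service?wsdl")   # placeholder WSDL
    client.call(:get_data).body                                       # placeholder operation
  end
end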
The problem is building an architecture with Backbone and Rails that handles syncing multiple actions to the server. Assume the model is defined on both the Rails and the Backbone side.

I have update and destroy operations on a model, and I need them to be synced with the server on a user action (a button click). On another part of the web app, these actions on this same model are synced the moment they are made (easy: just send a RESTful AJAX HTTP request).

But in the first case, I can't really figure out an easy, stateless, atomic/transactional way to save the several actions (requests) the user took. Sending multiple requests to the server makes the save non-atomic and not entirely stateless. Sending one big request with the actions formatted in it makes parsing on the server necessary.

So, is there another, better solution?
If you want multiple updates, on different resources, as one atomic transaction, that is not REST.
So, of course, you will have to orchestrate the parameters and the requests in Rails (but it's not really about parsing, since you'll be sending JSON; it's more about defining a format for the aggregated parameters and figuring out what to do with them on the Rails side).
A nice way to handle multiple requests at once is shown at https://github.com/railscasts/414-batch-api-requests
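If a full batch-API layer is more than you need, a single aggregated endpoint wrapped in a database transaction already gives the atomicity asked about: the client sends one JSON array of actions, and either every action succeeds or the whole batch rolls back. A minimal sketch; the route, parameter format, and Item model are made up for illustration:

# POST /items/batch with params like:
#   { actions: [ { op: "update",  id: 1, attributes: { name: "x" } },
#                { op: "destroy", id: 2 } ] }
class ItemsController < ApplicationController
  def batch
    ActiveRecord::Base.transaction do
      params[:actions].each do |action|
        item = Item.find(action[:id])
        case action[:op]
        when "update"  then item.update!(action[:attributes].permit!)
        when "destroy" then item.destroy!
        end
      end
    end
    head :ok
  rescue ActiveRecord::RecordInvalid, ActiveRecord::RecordNotFound
    head :unprocessable_entity   # any failure rolls back the whole batch
  end
end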
I am looking for the easiest, simplest way to access web APIs that return either JSON or XML, with concurrent requests.
For example, I would like to call the twitter search API and return 5 pages of results at the same time (5 requests). The results should ideally be integrated and returned in one array of hashes.
I have about 15 APIs that I will be using, and I already have code to access them individually (using simple Net::HTTP requests) and parse them, but I need to make these requests concurrent in the easiest way possible. Additionally, any error handling for JSON/XML parsing is a bonus.
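For reference, the standard library alone can do this with one thread per request. A minimal sketch of the five-page Twitter search example, merging the parsed pages into one array of hashes; the URL and the JSON shape are placeholders:

require 'net/http'
require 'json'

threads = (1..5).map do |page|
  Thread.new do
    uri = URI("https://search.twitter.com/search.json?q=rails&page=#{page}")   # placeholder URL
    begin
      JSON.parse(Net::HTTP.get(uri))["results"]   # placeholder JSON shape
    rescue JSON::ParserError
      []                                          # skip pages that fail to parse
    end
  end
end

results = threads.flat_map(&:value)   # waits for each thread and merges the arrays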
I'd recommend Weary.
It handles multiple simultaneous asynchronous requests by spawning a thread for each request, and with it you can write API connectors that are readable and DRY. On top of that, it has a built-in .parse method that works with JSON or XML responses.