Best wrapper for simultaneous API requests? - ruby-on-rails

I am looking for the simplest way to access web APIs that return either JSON or XML, with concurrent requests.
For example, I would like to call the Twitter search API and fetch 5 pages of results at the same time (5 requests). The results should ideally be merged and returned as a single array of hashes.
I have about 15 APIs that I will be using, and I already have code to access them individually (using a simple Net::HTTP request) and parse them, but I need to make these requests concurrent in the easiest way possible. Any error handling for JSON/XML parsing is a bonus.

I'd recommend Weary.
It handles multiple simultaneous asynchronous requests by spawning a thread for each request, and with it you can write API connectors that are readable and DRY. On top of that, it has a built-in .parse method that works with JSON or XML responses.
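If you'd rather not pull in a gem, the same thread-per-request idea takes only a few lines of standard-library Ruby. A minimal sketch (the Twitter search URL, the q=rails query, and the 'results' key are illustrative placeholders, not something Weary requires):

require 'net/http'
require 'json'
require 'uri'

# One thread per page; the five requests run concurrently.
threads = (1..5).map do |page|
  Thread.new do
    begin
      uri = URI("http://search.twitter.com/search.json?q=rails&page=#{page}")
      JSON.parse(Net::HTTP.get(uri))['results']
    rescue JSON::ParserError, SocketError, Timeout::Error
      []  # treat an unparseable or failed page as empty
    end
  end
end

# Thread#value joins each thread and returns its block's result,
# so this merges the five pages into one array of hashes.
results = threads.flat_map(&:value)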

Related

In a Rails API, is there a standard way to "paginate" or "chunk" large API requests responses?

Scenario:
I have a Rails API with a /tasks.json endpoint that serves all the Tasks.
For large Accounts this can sometimes be close to 1,000,000 records.
Sending 1,000,000 records over the wire as JSON generally doesn't work, due to network timeouts and the like.
I can easily "paginate" the number of tasks being sent so the endpoint uses something like /tasks.json?limit=10000&page=1 but how does the client know to send this?
Is there a "standard" way to handle large requests like this and break them up naturally into chunks?
Does each client need to handle this on their end manually?
Thanks!
You should use the kaminari gem. It handles pagination of large result sets for you, and it works in both Rails API apps and standard Rails apps.
https://github.com/kaminari/kaminari
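A minimal sketch of what the endpoint could look like with kaminari (the per-page default of 10,000 and the metadata keys are just one reasonable convention, not something kaminari dictates):

class TasksController < ApplicationController
  def index
    tasks = Task.page(params[:page]).per(params[:per_page] || 10_000)
    render json: {
      tasks:        tasks,
      current_page: tasks.current_page,
      total_pages:  tasks.total_pages,   # client pages forward until current_page == total_pages
      total_count:  tasks.total_count
    }
  end
end

As for how the client knows: there is no single standard, but a common convention is to return pagination metadata like the above, or a Link response header with rel="next"/rel="last" entries (as GitHub's API does), and have the client keep requesting the next page until it runs out.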

batch url execution with redstone

I am creating a REST API with Redstone and Dart.
Now I need to create a way to send several commands of the API in one POST.
Edit (@Pacane): Yes, the idea is to have one endpoint that can parse several commands within the API. Each command is another endpoint (route) that can also be accessed individually within the API.
My question is how I can parse and execute the URLs, and get the response for each URL, using Redstone.
I am thinking about using the unit-test MockRequest and MockResponse. You can see here:
https://github.com/redstone-dart/redstone/wiki/Unit-test
Now I wonder whether the mock request and response differ somehow from regular requests, e.g. whether they involve some extra analysis that would hurt performance.
So, there are two questions:
1) Is it OK to use Redstone's mocks to dispatch several URLs?
2) If not, how else could I do it?
Thanks
I ended up using the MockRequest and MockResponse to achieve the batch commands. You can learn more about them in the wiki: https://github.com/redstone-dart/redstone/wiki/Unit-test

Best practices to make lengthy / multiple AJAX calls to Rails backend

Let's say my front-end makes a ton of AJAX calls to the Ruby backend for JSON objects.
This is open-ended, but from your experience, which is the better option? I'm concerned with 1. performance and 2. style.
Option 1: Make one AJAX request to the backend that returns a lengthy JSON string.
Option 2: Break the request down into several AJAX requests that each return a shorter JSON string.
Or an Option 3? I'll take other alternatives into consideration.
Thanks!
If you are making frequent AJAX requests, you could try WebSockets, which are widely used for this kind of application these days. Here is a link with a very good comparison of the different technologies used for such requirements: In what situations would AJAX long/short polling be preferred over HTML5 WebSockets?

Rails or Sinatra app - how to maintain background threads

I'd like to maintain a data structure on a Sinatra or Rails server (it doesn't matter which) that is accessible to all HTTP requests that arrive at it (i.e. it supports concurrent modification). I don't want to rely on a database or similar because that wouldn't let me attach callbacks to modifications of the data structure or block HTTP response threads on it.
Since HTTP is stateless, there's apparently no easy way to achieve this.
How can I run a process that maintains data in the background for all the requests that arrive at an HTTP server, without relying on external programs and middleware? Does it require me to modify Rails or Sinatra? Is there any alternative, even outside Ruby?
When using Sinatra, you can just spawn a thread at the end of your application:
http://blog.markwatson.com/2011/11/ruby-sinatra-web-apps-with-background.html
Using this, you can maintain a worker that keeps doing things as HTTP requests come and go.
Sinatra also has the methods before and after which run before and after each request, respectively.
So if you wanted to add data to a data structure before each request is handled, you could:
before do
  # runs before every request; swap the puts for your own bookkeeping
  puts request.path_info
end
Using these tools, you can easily achieve what you want to do.
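Putting the pieces together, here is a minimal sketch of the whole idea. The STORE/LOCK names and the 60-second expiry are arbitrary, and it assumes a single multi-threaded server process (e.g. Thin or Puma):

require 'sinatra'
require 'thread'

STORE = {}          # shared in-process data structure
LOCK  = Mutex.new   # guards access from concurrent request threads

# Record every incoming request in the shared structure.
before do
  LOCK.synchronize { STORE[request.path_info] = Time.now }
end

get '/stats' do
  LOCK.synchronize { STORE.inspect }
end

# Background worker running alongside the request handlers:
# here it expires entries older than 60 seconds.
Thread.new do
  loop do
    LOCK.synchronize { STORE.delete_if { |_, seen| Time.now - seen > 60 } }
    sleep 10
  end
end

Note that this only works while the app runs as a single multi-threaded process; under a multi-process server each worker would get its own copy of the structure.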

How to stream JSON response from MVC controller method?

My company has a Phonegap application that's running into errors with UIWebView; it's hitting the 10MB memory cap. We're using .NET MVC4 as our backend.
This is due to a very large JSON response from a sync call. The JSON string (approximately 12 MB of data) is held in memory, which causes a memory exception. Due to the way our application works, we have to send this data.
I'm considering using Oboe.js to stream in the JSON response (to avoid allocating the full JSON object). I think that this may fix the issue on the frontend, but I'd like to stream the JSON response from the backend. I know that there are chunking methods, but I'd prefer not to use that option.
Currently, our controller method produces an object, serializes it to JSON, and sends the entire response. I would like to open a one-way HTTP stream, if possible, and asynchronously write JSON data to this stream (which Oboe would parse on the client side).
Are there any technologies that fit my use case, or can someone point me to some methods I may use to accomplish this? I don't need a two-way conduit - I just want to write the JSON to the HTTP stream as objects are produced.
Thank you.
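The question targets MVC4, but since the rest of this page is Rails-flavored, here is what the same one-way pattern looks like in Rails 4+ with ActionController::Live: each serialized object is written to response.stream as it is produced, which is exactly the kind of incremental body Oboe.js can parse. A sketch, assuming a Task model and a streaming-capable server such as Puma:

class TasksController < ApplicationController
  include ActionController::Live

  def index
    response.headers['Content-Type'] = 'application/json'
    response.stream.write '['
    # find_each batches the query, so the full result set
    # is never materialized in memory at once.
    Task.find_each.with_index do |task, i|
      response.stream.write ',' if i > 0
      response.stream.write task.to_json
    end
    response.stream.write ']'
  ensure
    response.stream.close
  end
end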
