My front-end is a Rails application, and my backend is a Padrino application.
I want to fetch a large amount of CSV data from the backend. Doing so makes the HTTP request time out in the browser.
I cannot query the backend again and again, because each request regenerates the same data; there is no concept of offset and limit on the records.
I tried sending directly from the backend to the LB, but it is not working for me.
To summarize: an array of 10000k rows is generated that needs to be sent to the UI or downloaded in a streaming fashion.
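To illustrate what "streaming fashion" could look like on the Rails side, here is a minimal sketch that proxies the backend's CSV to the browser chunk by chunk, assuming a hypothetical Padrino endpoint at backend.internal/report.csv (the host, path, and controller name are made up):

require "net/http"

class ReportsController < ApplicationController
  include ActionController::Live # allows writing to response.stream

  def download
    response.headers["Content-Type"] = "text/csv"
    response.headers["Content-Disposition"] = 'attachment; filename="report.csv"'

    uri = URI("http://backend.internal/report.csv") # hypothetical backend endpoint
    Net::HTTP.start(uri.host, uri.port) do |http|
      http.request_get(uri.path) do |backend_response|
        # Forward the CSV chunk by chunk so it never has to fit in memory and
        # the browser starts receiving data before the whole export is built.
        backend_response.read_body { |chunk| response.stream.write(chunk) }
      end
    end
  ensure
    response.stream.close
  end
end

With this approach the browser sees bytes almost immediately, so the request is much less likely to hit the timeout even for very large exports.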
Related
Currently my app pulls data from a custom API, parses it, and saves the data to multiple arrays.
I am using AWS RDS to store all the data that the API exposes, and AWS EC2 to host the file that serves the API.
The problem I have run into is that each download from the API is ~1 MB and AWS charges $0.09/GB of data transfer. I need to lower costs, so I can't have my app pulling the API data every time the refresh function is called. (My API updates every 4 hours; if users refresh the app before my API has updated, the refresh function will do nothing.)
My current idea to solve this is either:
(1) download the JSON data onto the device, then parse & save the offline data to arrays
(2) or download and parse it into arrays, then save those arrays locally (from searching I believe I need to use NSKeyedArchiver or UserDefaults?)
I am not sure what the best approach to this is.
Scenario:
I have a Rails API with a /tasks.json endpoint that serves all the Tasks.
For large Accounts this can sometimes be close to 1,000,000 records.
Sending 1,000,000 records over the wire in JSON generally doesn't work due to a network timeout, etc.
I can easily "paginate" the number of tasks being sent so the endpoint uses something like /tasks.json?limit=10000&page=1 but how does the client know to send this?
Is there a "standard" way to handle large requests like this and break them up naturally into chunks?
Does each client need to handle this on their end manually?
Thanks!
You should use the kaminari gem. It handles paginating your records and works in both Rails API apps and standard Rails apps.
https://github.com/kaminari/kaminari
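For the /tasks.json case above, a controller sketch using kaminari might look like this; the X-Total-Pages / X-Current-Page headers are just one convention for telling the client how many pages to request, not something kaminari dictates:

class TasksController < ApplicationController
  def index
    # kaminari adds .page and .per to ActiveRecord relations
    tasks = Task.page(params[:page]).per(params.fetch(:limit, 10_000))

    # Expose paging metadata so the client knows when to stop asking for more
    response.set_header("X-Total-Pages", tasks.total_pages.to_s)
    response.set_header("X-Current-Page", tasks.current_page.to_s)

    render json: tasks
  end
end

The client then loops over page=1, 2, 3, ... until it has fetched X-Total-Pages pages.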
I am working on Xamarin iOS. I am using web services to get and post data. When I try to upload a large amount of data, nothing happens: in that case the service call never reaches the server. If I upload a small amount of data, it works fine.
So I just want to confirm: is there any limitation on uploading data from an iPhone app via web services? If not, what could cause my service call to never reach the server when the data is large?
For RESTful service:
There is no limit for POST and PUT requests when sending data to a web service. But if you use a GET request, there is a limit on how much data can be sent with the request.
For GET, the data size limit depends on the type of server.
- For Tomcat - default is 2MB (Upgradable**)
- For PHP - default is 2MB (Upgradable**)
- For Apache - default is 2MB (Upgradable**)
**Upgradable: For more details, click on links.
Oracle - Web service development guideline
Note:
If you are uploading large files, then use the multipart upload method/technique to send your data to the web server.
Check the request timeout interval set on the web service request; large data requires more time than a normal request.
Use asynchronous requests, which run in the background, for better data transmission without blocking the user's interaction with the application's UI.
I'm wondering what the best way to go about developing a rails application with the following features:
All of the data comes from a SOAP request to a 3rd party
A background task will make this SOAP request every ~10s
The background task will parse the response and then update an ActiveRecord model accordingly
The data isn't written to a database at all; if the app fails, the data will come from the SOAP request again when we start it back up
Users will make a request to the app which will simply show data in the model (i.e. from the soap request).
The idea is to avoid making the SOAP request for every single user as the data won't change that frequently. Not using a database avoids reading and writing of data that only ever comes from the request anyway.
I imagine that all of this can be accomplished quite simply with a few gems, but I've had a bit of trouble sorting out which ones meet my requirements and which don't.
Thanks
I'm not sure what benefit you're getting from using ActiveRecord in this case.
Perhaps consider some other type of persistence for the SOAP calls?
If the results from the web service really aren't changing, I would recommend the Rails caching mechanism. Anywhere in your Rails app, you can do:
Rails.cache.fetch "a_unique_cache_key" do
  # ... do your SOAP request and return the result
end
This will do the work within the block just once and fetch its result from the Rails cache store on subsequent calls.
The cache store can be of various types (one of which is the memcache store). I usually go with the file store for medium-traffic sites, but you may choose another:
http://guides.rubyonrails.org/caching_with_rails.html
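Since the question mentions the upstream data changing roughly every 10 seconds, you can also pass an expiry so the cached result refreshes itself. A small sketch, where SoapClient.fetch_data is a hypothetical wrapper around the actual SOAP call:

def current_data
  Rails.cache.fetch("soap_data", expires_in: 10.seconds) do
    SoapClient.fetch_data # hit the 3rd-party SOAP service at most once per interval
  end
end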
I am searching for a good solution to manage data updates through REST and JSON.
The client is an iPad, and I want that client to stay up to date.
The problem is that the amount of data is very large and may change often.
Let's assume I have customer data in the backend. The client iPad should sync this data with the backend system. But customer data may be changed or deleted at any time. Further, there are more than 1,000 customers.
I do not really want the client to connect to http://www.example.com/customers/ and then send a request for each of the 1,000 customers in that list...
Any ideas to solve such a problem nicely?
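One common pattern for this is a delta-sync endpoint keyed on a timestamp, so the client only pulls what changed since its last sync. A minimal sketch, assuming a Rails-style backend and a soft-deleted Customer model (the updated_since parameter and deleted_at column are assumptions, not part of the question):

class CustomersController < ApplicationController
  def index
    # Default to "the beginning of time" for a client's first sync
    since = Time.iso8601(params.fetch(:updated_since, Time.at(0).utc.iso8601))

    changed = Customer.where("updated_at > ?", since)

    render json: {
      synced_at:   Time.now.utc.iso8601,
      customers:   changed.where(deleted_at: nil),
      deleted_ids: changed.where.not(deleted_at: nil).pluck(:id)
    }
  end
end

The iPad stores the returned synced_at value and sends it back as updated_since on the next sync, so each request only carries the customers that actually changed or were deleted.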