I have an ActiveResource model in one of my applications, and I occasionally need to do a find(:all) and force it to re-pull the data from the remote service. How can I do this? I saw the connection(refresh=true) piece, but I don't want it to refresh EVERY SINGLE TIME. I just want to be able to flush the cache when I choose to, or to force a particular request to re-pull from the remote.
You might check out cached_resource (I'm not sure how you are caching currently). cached_resource caches the responses to requests made through ActiveResource. At the moment it seems to cache every request that goes through ActiveResource, but it allows you to refresh a specific request by doing:
MyActiveResource.all(:reload => true).
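For reference, cached_resource is typically switched on with a single call in the model; a minimal sketch (the class and site names here are illustrative, not from your app):

# Gemfile
gem "cached_resource"

class MyActiveResource < ActiveResource::Base
  self.site = "https://remote.example.com" # hypothetical remote service
  cached_resource                          # cache responses for this class
end

With that in place, the :reload => true option above bypasses the cache for a single call.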
As far as I know, ActiveResource doesn't do any caching and will pull from the remote service every time you do find(:all).
OK, first time making an API!
My assumption is that if data needs to be stored on the back end such that it persists across multiple API calls, it needs to be 1) in a cache or 2) in a database. Is that right?
I was looking at the code for the gem "google-id-token". It seems to do just what I need for my Google login application. My front-end app will send the Google tokens to the API with requests.
The gem appears to cache the public (PEM) certificates from Google (for an hour by default) and then uses them to validate the Google JWT you provide.
But when I look at the code (https://github.com/google/google-id-token/blob/master/lib/google-id-token.rb), it just seems to fetch the Google certificates and put them into an instance variable.
Am I right in thinking that the next time someone calls the API, it will have no memory of that stored data and will just fetch it again?
I guess it's a two-part question:
If I put something in an @instance_variable in my API, will that data exist when the next API call comes in?
If not, is there any way that "google-id-token" is caching its data correctly? Maybe HTTP requests are somehow cached on the back end, so the network request doesn't actually happen over and over? Can I test this?
My impulse is to rewrite the "google-id-token" functionality in a way that caches the Google certs using MemCachier. But since I don't know what I'm doing, I thought I would ask. Maybe the gem works fine as is; I don't know how to test it.
Not sure about google-id-token, but Rails instance variables are not available beyond single requests and views (and definitely not from one user's session to another).
You can low-level cache anything you want with Rails.cache.fetch: it wraps a block and takes a key name and an expiration. It looks like this:
Rails.cache.fetch("google-id-token", expires_in: 24.hours) do
#instance_variable = something
end
If the cached value exists and is not past its expiration date/time, Rails returns it from the cache; otherwise, it runs the block and makes your API request.
It's important to note that low-level caching doesn't give you a shared store with the default development setup (the in-memory store is per-process, and caching is off entirely unless you enable it), so you'll need redis or memcached or something like that for development, too. Also, make sure the file tmp/caching-dev.txt exists; you can run rails dev:cache or just touch the file to create it.
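Applied to the certificate question above, a minimal sketch could look like this (fetch_google_certs is a hypothetical helper standing in for whatever actually downloads the PEM certificates):

def google_certs
  # Cache the certificates across requests and processes instead of holding
  # them in an instance variable that disappears with the request.
  Rails.cache.fetch("google-id-token/certs", expires_in: 1.hour) do
    fetch_google_certs # hypothetical: downloads the PEM certs from Google
  end
end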
More on Rails caching
I'm working on a Rails application API. I have models with a lot of data.
From my client application, I would like to be able to send a token, a timestamp, or something similar in order to get only the new, updated, and deleted content since the last request, while receiving a new token/timestamp to use for the next request.
That way, I only have to update my locally cached content on the client side based on the result of the request, rather than re-fetch all of my local data on every request.
After a lot of searching on Google, I didn't find anything convincing.
I don't know how to manage that on the server side. Is that a good practice?
If yes, what's the best way to do it?
Ruby on Rails 4.1.4
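One common way to handle this (only a sketch; the parameter name, model, and response shape below are assumptions, not an established convention) is to have the client send the timestamp of its last successful sync and have the server return everything changed since then, plus a fresh timestamp for the next call:

# app/controllers/api/articles_controller.rb (illustrative names)
class Api::ArticlesController < ApplicationController
  def changes
    since = params[:since].present? ? Time.zone.parse(params[:since]) : Time.at(0)

    render json: {
      updated: Article.where("updated_at > ?", since),
      # Deletions need their own bookkeeping (a soft-delete flag or a
      # deletions table), since destroyed rows can't be queried anymore.
      server_time: Time.zone.now.iso8601
    }
  end
end

The client stores server_time and sends it back as the since parameter on its next request.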
I made an interface to a Twitch gem to fetch information about the current stream, mainly whether it is online or not, but also things like the current title and the game being played.
Since the website has a lot of traffic, I can't make a request every time a user walks in, so instead I need to cache this information.
Cached information is stored in a class variable @@stream_data inside the class Twitcher.
I've made a rake task to update this via a cron job, calling Twitcher.refresh_stream, but naturally that is not running within my active process (the one every visitor connects to) but in a separate process. So @@stream_data in the actual app is always empty.
Is there a way to run code, within my currently running rails app, every X minutes? Or a better approach, for that matter.
Thank you for your time!
This sounds like a good case for caching:
Rails.cache.fetch("stream_data", expires_in: 5.minutes) do
fetch_new_data
end
If the data is in the cache and has not expired, it is returned without executing the block; if not, the block is run to populate the cache.
The default cache store just keeps things in memory, so it doesn't fix your problem: you'll need to pick a cache store that is shared across your processes. Both redis and memcached (via the dalli gem) are popular choices.
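A minimal sketch of pointing Rails at a shared store, assuming memcached is running locally and the dalli gem is in your Gemfile:

# Gemfile
gem "dalli"

# config/environments/production.rb (and development.rb if you want it there too)
config.cache_store = :mem_cache_store, "localhost:11211"

A redis-backed store works just as well; the important part is that every server process talks to the same store.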
Check out Whenever (basically a ruby interface to cron) to invoke something on a regular schedule.
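With Whenever, the schedule lives in config/schedule.rb; a sketch for this case (assuming Twitcher.refresh_stream is changed to write into the shared cache rather than a class variable) might be:

# config/schedule.rb
every 5.minutes do
  runner "Twitcher.refresh_stream" # runs inside the Rails environment via rails runner
end

Running whenever --update-crontab then writes the corresponding cron entry.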
I actually had a similar problem using Google Analytics. Google Analytics requires an API key for each request, but the API key would expire every hour. Requesting a new API key for every Google Analytics request would make each request very slow.
So what I did was make another class variable @@expires_at. In every method that made a request to Google Analytics, I would check @@expires_at.past?. If it was true, I would refresh the API key and set @@expires_at = 45.minutes.from_now.
You can do something like this:
def method_that_needs_stream_data
  renew_data if @@expires_at.nil? || @@expires_at.past?
  # use @@stream_data
end

def renew_data
  # refresh @@stream_data here (e.g. call the Twitch API)
  @@expires_at = 5.minutes.from_now
end
Tell me how it goes.
I'm curious about NSURLCache. NSURLRequest has several cache policies,
like NSURLRequestUseProtocolCachePolicy and NSURLRequestReturnCacheDataElseLoad,
but after reading about them, each one either uses the local cached data or skips the cache entirely.
My question is: if I want to start a URL request, first load the cache and render the UI, and then keep talking to the server to grab the latest data and refresh the UI, which policy should I choose?
If I understand correctly what you want to achieve (get data quickly from the cache to show in the UI even if it is outdated, then get the current data even if it is slow), you'd have to make two requests using different cache policies. I'd start the second request only when the first one has finished, and examine the result of the first request first, because the data might not have been available in the cache, in which case the first request may already have returned the uncached data that you wanted.
I have a Rails app that people can send data to in the query params of a URL. The Rails app then validates the correctness of the data and creates a JSON response listing any detected errors. The validation itself is done by checking the data against a set of rules that live in a GitHub repo.
Ideally I'd like to update my local copy of this repo once a day. To prevent complications, I'd like any requests that come in while this update takes place to back off for a few seconds.
What's the best way to communicate to the incoming requests that an update is currently occurring? I'm using a process-based web server (unicorn), so memory mutexes don't seem like the right answer :(.
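One hedged option, since unicorn workers don't share memory: set a flag in a store every worker can see (Rails.cache backed by memcached/redis, or even a file on disk) while the rules repo is being updated, and have the app answer 503 with a Retry-After header while the flag is set. A rough sketch with illustrative names (update_rules_repo and pull_rules_from_github are hypothetical):

# wherever the daily update runs (rake task, cron job, etc.)
def update_rules_repo
  Rails.cache.write("rules_update_in_progress", true)
  pull_rules_from_github # hypothetical method that refreshes the local copy
ensure
  Rails.cache.delete("rules_update_in_progress")
end

# app/controllers/application_controller.rb
class ApplicationController < ActionController::Base
  before_action :back_off_during_rules_update

  private

  def back_off_during_rules_update
    if Rails.cache.read("rules_update_in_progress")
      response.headers["Retry-After"] = "5"
      render json: { error: "rules update in progress, retry shortly" }, status: 503
    end
  end
end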