In my rails app I have answer_controller and connect_controller.
app/controllers/answer_controller.rb and app/controllers/connect_controller.rb
My connect_controller has the code below.
@url = api_version_root + '/members/all?council=' + session[:council]
response = RestClient.get @url, api_token_hash
if response.code == 200
  @members = JSON.parse(response.body)
end
I want to access @members within answer_controller. How can I do this?
A controller's instance variables only exist for a single request, and only one controller action is called per request in Rails by design. This is generally true of any framework/platform: if you want to persist anything between requests, you need to either pass it along or store it somewhere, because the thread serving a request is terminated when it finishes, and its variables are gone with it.
There are many ways to pass data back and forth between the client and server, such as query string parameters, cookies and the session (stored in cookies by default). The size is strictly limited by the client, for example the cookie size limit of roughly 4096 bytes and the de facto limit of about 2000 characters for URLs.
You can persist data on the server by using Rails' built-in cache mechanism, a database, memory-based storage (such as Memcached or Redis) or the server's file system.
Which to use depends on exactly how the data is being used, its size and what architecture you have in place.
[MVC model] Per request, just one controller action is called.
If you want to reuse a variable across requests, put it in the session:
session[:members] = @members
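Concretely, the hand-off looks like this (a plain Hash stands in for the real cookie-backed session store, and the payload is made up for illustration):

```ruby
require 'json'

# A plain Hash stands in for Rails' cookie-backed session store.
session = {}

# Request 1 (connect_controller): parse the API response and stash it.
body = '{"members":[{"name":"Alice"},{"name":"Bob"}]}'  # pretend RestClient body
session[:members] = JSON.parse(body)['members']

# Request 2 (answer_controller): read it back on a later request.
members = session[:members]
```

Keep the ~4 KB cookie limit in mind, though: a large members list belongs in the cache or database rather than the session.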
I would use Rails caching, but you must think carefully about how to expire the cache. So:
@members = cached_members

private

def cached_members
  @url = api_version_root + '/members/all?council=' + session[:council]
  Rails.cache.fetch("#{@url}/#{api_token_hash}/members", expires_in: 48.hours) do
    response = RestClient.get @url, api_token_hash
    (response.code == 200) && JSON.parse(response.body)
  end
end
Then duplicate this in the answer controller, and @members will be populated from the cache. Now, of course it's poor practice to actually duplicate code, so pull it out into a helper mixin, like:
module MemberCache
  def cached_members
    @url = api_version_root + '/members/all?council=' + session[:council]
    Rails.cache.fetch("#{@url}/#{api_token_hash}/members", expires_in: 48.hours) do
      response = RestClient.get @url, api_token_hash
      (response.code == 200) && JSON.parse(response.body)
    end
  end
end
and include this in the two controllers:
require 'member_cache'

class ConnectController < ApplicationController
  include MemberCache
end
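To see why the mixin works, here is a pure-Ruby sketch of the same pattern: a Hash stands in for Rails.cache, two bare classes stand in for the two controllers, and a counter shows the expensive call runs only once (all names in the sketch are made up for illustration):

```ruby
CACHE = {}          # stands in for Rails.cache
API_CALLS = []      # records each time the "expensive" fetch runs

module MemberCache
  def cached_members
    # Rails.cache.fetch equivalent: run the block only on a cache miss.
    CACHE[:members] ||= begin
      API_CALLS << :rest_client_get   # the real RestClient call would go here
      [{ 'name' => 'Alice' }]
    end
  end
end

class ConnectController; include MemberCache; end
class AnswerController;  include MemberCache; end
```

Both controllers call cached_members; whichever runs first pays for the API call, and the other reads the cached value.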
I have an API function defined in my Rails controller, and another connected to my database that I created using scaffold.
def function
  @results = HTTParty.get($baseurl + "/extension", :headers => {
    $apikey => $apivalue,
    "Content-Type" => "application/json"
  })
  render json: @results.body
end
I have defined a rake task that is executed with clockwork, but to make it work in the current development environment I had to use HTTParty to call the endpoint internally and work with the received hash.
def rake_function
  @results = HTTParty.get("http://localhost:3000/controller/extension/")
  return @results.parsed_response
end
In addition, my task makes a POST and a PUT when the execution is finished, and I must use HTTParty for that as well.
if !datExist
  @response = HTTParty.post("http://localhost:3000/controller/extension/",
    body: eodData
  )
else
  checkId = dbData.select { |x| x["date_time"] == yesterday }
  id = checkId[0]["id"].to_s
  @response = HTTParty.put("http://localhost:3000/controller/extension/" + id,
    body: eodData
  )
end
I know it's not the optimal way, so I would like to be able to execute the function already defined in my controller from my rake task.
You don't.
The only public methods of your controller should be the "actions" that are called by the Rails router when it responds to HTTP requests.
Controllers are Rack applications and have a hard dependency on an incoming HTTP request: they simply don't work outside that context, such as in a Rake task, and using your controllers as a junk drawer leads to the "Fat Controller" anti-pattern.
Controllers already have tons of responsibilities: they are basically stitching your entire application together by passing user input to models and models to views. Don't give them more jobs to do.
It's a very easy problem to avoid by simply moving your API calls into their own class. One way of designing this is with client classes that are solely responsible for communicating with the API:
class MyApiClient
  include HTTParty
  format :json
  base_uri $baseurl # Code smell - avoid the use of globals

  attr_reader :api_key, :api_value

  def initialize(api_key:, api_value:)
    @api_key = api_key
    @api_value = api_value
  end

  def get_extension
    # I don't get why you think you need to dynamically set the header key
    self.class.get('/extension', headers: { api_key => api_value })
  end
end
This lets you simply reuse the code between your controller and rake task, and it isolates the code touching the application boundary, which avoids a tight coupling between your application and the external collaborator.
HTTParty also really shines when you actually use it for object-oriented rather than procedural code.
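Usage then looks the same from a controller action and from a rake task. Here is a sketch with the HTTP layer injectable, so the example runs without a network; the client and header names are assumptions, not your real API:

```ruby
class MyApiClient
  attr_reader :api_key, :api_value

  # `http` is anything that responds to #get(path, headers);
  # in production it would delegate to HTTParty.
  def initialize(api_key:, api_value:, http:)
    @api_key = api_key
    @api_value = api_value
    @http = http
  end

  def get_extension
    @http.get('/extension', { @api_key => @api_value, 'Content-Type' => 'application/json' })
  end
end

# Fake transport for demonstration; echoes back what it was asked for.
class FakeHttp
  def get(path, headers)
    [path, headers]
  end
end

client = MyApiClient.new(api_key: 'Api-Key', api_value: 'secret', http: FakeHttp.new)
path, headers = client.get_extension
```

The controller action and the rake task both just instantiate the client and call get_extension; neither needs to know how the request is made.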
I have built an app that consumes a JSON API. I removed Active Record from my app because the data in the API can theoretically change, and I don't want to wipe the database each time.
Right now I have a method called self.all for each class that loops through the JSON creating Ruby objects. I then call that method in various functions in order to work with the data, finding sums and percentages. This all works fine but seems a bit slow. I was wondering if there is somewhere I should be storing my .all call rather than instantiating new objects for each method that works with the data.
...response was assigned above using HTTParty...
def self.all
  puppies = []
  if response.success?
    response['puppies'].each do |puppy|
      puppies << new(puppy['name'],
                     puppy['price'].to_money,
                     puppy['DOB'])
    end
  else
    raise response.response
  end
  puppies
end
# the methods below only accept arguments to allow testing with Factories
# puppies is passed in as Puppy.all
def self.sum(puppies)
  # returns a Money object
  sum = Money.new(0, 'USD')
  puppies.each do |puppy|
    sum += puppy.price
  end
  sum
end

def self.prices(puppies)
  puppies.map { |puppy| puppy.price }
end

def self.names(puppies)
  puppies.map { |puppy| puppy.name }
end
....many more methods that take an argument of Puppy.all in the controller....
Should I use caching? Should I bring back Active Record, or is how I'm doing it fine? Should I store Puppy.all somewhere rather than calling the method each time?
What I guess is happening is that you are making a request with HTTParty every time you call any class method. What you can consider is creating a class variable for the response and a class variable called expires_at. Then you can do some basic caching.
@@expires_at = Time.zone.now
@@http_response = nil

def self.make_http_call
  renew_http_response if @@expires_at.past?
  @@http_response
end

def self.renew_http_response
  @@http_response = HTTParty.get(api_url) # your HTTParty request goes here
  @@expires_at = 30.minutes.from_now
end

# And in your code, change response to @@http_response,
# ie response.success? to @@http_response.success?
This is all in memory, and you lose everything if you restart your server. If you want more robust caching, the better thing to do would probably be to look into Rails low-level caching.
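The expire-after logic can be sketched in plain Ruby without Rails, which also makes the trade-off visible: one process-local value, refreshed at most every 30 minutes (the class and method names here are made up for illustration):

```ruby
# Process-local cache with a time-based expiry, mimicking the
# expires_at / http_response class-variable pair described above.
class ExpiringValue
  def initialize(ttl_seconds, &fetcher)
    @ttl = ttl_seconds
    @fetcher = fetcher           # the HTTParty request would live here
    @expires_at = Time.at(0)     # force a fetch on first access
  end

  def value
    if Time.now >= @expires_at
      @cached = @fetcher.call
      @expires_at = Time.now + @ttl
    end
    @cached
  end
end

calls = 0
response = ExpiringValue.new(30 * 60) { calls += 1; { 'success' => true } }
first  = response.value   # triggers the fetch
second = response.value   # served from memory, no second call
```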
I am building a sample Rails 4 app and I'm unclear about something. I want to access an external API to pull data on sports news via an Ajax call.
So, for example, if you have a list of teams in the teams#index view, when you click on one team a widget gets populated with the latest results/scores for that team; the results info is provided by an external API service, not the local database.
Do I need to create a controller for this service so the Rails Ajax request has a local endpoint? Should the actual request mechanism happen in this controller, or would it be better to build a helper for the data request and call that from the controller?
On the other hand, it's possible to do it all via JavaScript in the browser.
Thanks. I realize there are a dozen ways to do things in Rails; I'm just unclear on the "right" way to handle this type of situation.
I tend to do this with a helper module that you can unit test independently. To give you a similar, trivial example, here's a module that you could use to wrap the Gravatar API:
# /lib/gravatar.rb
module Gravatar
  def self.exists(email)
    url = image_url(email) + '?d=404'
    response = HTTParty.get(url)
    response.code != 404
  end

  def self.image_url(email, size = nil)
    size_url = size ? '?s=' + size.to_s : ''
    "http://gravatar.com/avatar/#{gravatar_id(email)}.png" + size_url
  end

  def self.gravatar_id(email)
    Digest::MD5.hexdigest(email.downcase)
  end
end
Then, you can make a call to Gravatar::image_url as necessary. If you wanted to be able to access a Gravatar image via an ajax call, you could simply wrap it in a controller:
# /app/controllers/api/users_controller.rb
class Api::UsersController < Api::BaseController
  def gravatar_for_user_id
    user = User.find_by_id(params[:id])
    render plain: Gravatar.image_url(user.email), status: 200
  end
end
This model can be applied to whatever external APIs you need to hit, and modularizing your interface will always make unit testing more straightforward.
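For instance, the URL-building half of a Gravatar wrapper is pure Ruby, so it can be unit tested with no HTTP stubbing at all. A self-contained sketch (the module is repeated here so the snippet stands on its own; only an exists-style check would need HTTP):

```ruby
require 'digest'

module Gravatar
  def self.image_url(email, size = nil)
    size_url = size ? '?s=' + size.to_s : ''
    "http://gravatar.com/avatar/#{gravatar_id(email)}.png" + size_url
  end

  def self.gravatar_id(email)
    # Gravatar IDs are the MD5 of the lowercased email address.
    Digest::MD5.hexdigest(email.downcase)
  end
end

# Assertion-style checks, as a stand-in for a real test file.
id = Digest::MD5.hexdigest('user@example.com')
with_size = Gravatar.image_url('User@Example.com', 80)
```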
Is there a way to pre-build a page cache without calling the actual page via a http request?
I looked at solutions like this and this, but these don't generate the cache.
I have a relatively complicated view, and want to cache the entire thing. I want to pre-build this cached version in the application so when a user actually hits it, it will already be there.
Thanks
We had a need to do something similar from a rake task: a partial that displayed a very long list of entities (~700) which were somewhat context-specific and which, due to a series of database structure issues and custom sorting criteria, would easily take more than 25 seconds to render the first time before going into cache. This would often time out, because our HTTP servers were set to terminate HTTP requests with no response after 30 seconds, so pre-caching this custom list was the solution.
What you need to do is create an instance of ActionController::Base, or of one of your controllers if you need helper methods or other entities, then pass its lookup_context reference to a new instance of ActionView::Renderer.
In our rake task, we did the following:
namespace :rake_for_time_consuming_nonsense do
  task :pre_cache_long_list do
    PreCacher.pre_fetch_partials
  end
end

class PreCacher
  def self.pre_fetch_partials
    the_controller = ActionController::Base.new
    # Set any instance variables required by your partial in the controller;
    # they will be passed to the partial with the view_context reference
    the_controller.instance_variable_set "@cache_key", cache_key
    the_controller.instance_variable_set "@the_object", MyModel.first

    view_renderer = ActionView::Renderer.new the_controller.lookup_context
    view_renderer.render the_controller.view_context, { partial: 'my_model/the_partial', layout: false }
  end
end
This works in Rails 3.2.13.
I think the following link should give you a good start.
How do I get the rendered output of a controller's action without visiting the web page?
I tried to accomplish the same and, as far as I can see, your fake request should have the correct host, because the cache key includes host information.
I accomplished caching by using ActionController::Integration::Session
ais = ActionController::Integration::Session.new
ais.host = host
ais.xml_http_request(:post, url, params, headers)
I've got another one:
class FakeRequest
  include ActionController::UrlWriter

  def initialize(url, params, session, host)
    @url = url
    @params = params
    @session = session
    default_url_options[:host] = URI.parse(host).host
  end

  def post
    process(:post)
  end

  def get
    process(:get)
  end

  def xhr
    process(:post, true)
  end

  def process(method, ajax = false)
    uri = URI.parse(url_for(@url))
    request = ActionController::TestRequest.new({ 'HTTP_HOST' => uri.host, 'rack.input' => '', 'rack.url_scheme' => 'http' })
    request.query_parameters = @params
    request.path = uri.path
    request.host = uri.host
    request.env['REQUEST_METHOD'] = method.to_s.upcase
    if ajax
      request.headers['X-Requested-With'] = 'XMLHttpRequest'
    end
    @session.each_pair do |k, v|
      request.session[k] = v
    end
    response = ActionController::TestResponse.new
    controller = ActionController::Routing::Routes.recognize(request).new
    return controller.process(request, response)
  end
end
This will also return the response object.
I have a Sinatra class in a Rails project. It uses eventmachine and async_sinatra to make asynchronous calls to external sites. I'd like to write to a session object (ideally, the same one that Rails is using), but so far I can only:
write to a separate session object from Rails' (by default, Sinatra names its session something different from Rails)
write to the same session for synchronous calls only
When I make asynchronous calls, sessions written in the async_sinatra code don't get pushed out to the client machine. I suspect one of two things is happening:
The header has already been sent to the client and the local variable storing the session (in Sinatra) will be flushed out at the end of the action. The client would never see a request from the server to save this data to a cookie.
The header is being sent to the client, but Rails immediately sends another, instructing the client to write to the cookie what Rails has stored in its session variable, overwriting what Sinatra wrote.
Either way, I'd like to just get simple session functionality in both Sinatra and Rails. An explanation of what I'm doing wrong would also be nice :)
A full working copy of the code is on github, but I believe the problem is specifically in this code:
class ExternalCall < Sinatra::Base
  use ActionDispatch::Session::CookieStore
  register Sinatra::Async

  get '/sinatra/local' do
    session[:demo] = "sinatra can write to Rails' session"
  end

  aget '/sinatra/goog' do
    session[:async_call] = "async sinatra calls cannot write to Rails' session"
    make_async_req :get, "http://www.google.com/" do |http_callback|
      if http_callback
        session[:em_callback] = "this also isn't saving for me"
      else
        headers 'Status' => '422'
      end
      async_schedule { redirect '/' }
    end
  end

  helpers do
    def make_async_req(method, host, opts = {}, &block)
      opts[:head] = { 'Accept' => 'text/html', 'Connection' => 'keep-alive' }
      http = EM::HttpRequest.new(host)
      http = http.send(method, :head => opts[:head], :body => {}, :query => {})
      http.callback(&block)
    end
  end
end
EDIT 7/15:
I changed the code on GitHub to include async-rack. async_sinatra can write to sessions when they are not shared with Rails. Compare the master and segmented_sessions branches for the behavior difference. (Or, on the master branch, change use ActionDispatch::Session::CookieStore to enable :sessions.)
This is because async_sinatra uses throw :async by default, effectively skipping the session middleware logic that stores your data. You could override async_response like this:
helpers do
  def async_response
    [-1, {}, []]
  end
end
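The control flow behind this can be demonstrated in plain Ruby: throw :async unwinds straight out of the enclosing catch, so any middleware code that runs after the app call, such as serializing the session into the Set-Cookie header, is never reached. A minimal sketch:

```ruby
session_saved = false

# The session middleware wraps the app in a catch-like boundary;
# throw :async (async_sinatra's default) jumps past the save step.
rack_response = catch(:async) do
  # ... the aget handler runs here ...
  throw :async, [-1, {}, []]   # bail out with the async placeholder response
  session_saved = true         # middleware's "persist session" step: never reached
end
```

Returning a plain [-1, {}, []] from async_response instead of throwing lets the response propagate back through the middleware stack normally, which is why the override restores session writes.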