Using Rails 3.2.14. I'm streaming a controller action response, as follows.
def my_controller_action
  stream = StreamingClass.new(*args) # responds to each
  response.sending_file = true
  headers.merge!(
    'Content-Disposition' => 'inline',
    'Content-Transfer-Encoding' => 'binary',
    'Cache-Control' => 'no-cache'
  )
  self.status = 200
  self.content_type = 'application/json'
  self.response_body = stream
end
The streaming works just fine, but the problem is that the controller action returns before the streaming is completed (i.e. before each is called on the 'stream' object). It basically returns immediately after assigning the 'stream' object to self.response_body.
I'm using the lograge gem to tidy up our logging. Lograge basically subscribes to the 'process_action.action_controller' notifications. It is logging the timings (i.e. duration, db_runtime, etc...) based on the actual controller return time, without tracking any time spent on the stream object code.
The heavy lifting occurs in a StreamingClass method, but I'm completely missing this info from the logs. Is there some way to include the streaming response timings in the logs?
I'm running into this same issue. It seems to me that the only way to tell when the streaming is complete is to wrap the response_body in a proxy object whose close method includes additional logic to record the stats you need. You could probably use something like this:
class BodyProxy
  def initialize(body)
    @body = body
  end

  def each(&block)
    @body.each(&block)
  end

  def close
    @body.close if @body.respond_to?(:close)
    # Your code here. Probably something involving `Time.now`
  end
end
This works because the Rack specification requires that close be called on the body "after iteration".
I'm unfamiliar with Lograge, so I don't know how you would send this information to that gem, but this should be enough to get you started.
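For example, here is a minimal sketch of wiring the proxy into the original action. The subclass name, the block-based callback, and the 'streaming_finished.my_app' event name are all inventions for illustration; lograge won't pick the event up without a custom subscriber (or its custom_options hook):

class InstrumentedBodyProxy < BodyProxy
  def initialize(body, &on_close)
    super(body)
    @on_close = on_close
  end

  def close
    super
    @on_close.call if @on_close
  end
end

# In the controller action, after setting headers and status as before:
started_at = Time.now
self.response_body = InstrumentedBodyProxy.new(stream) do
  # fires only after Rack has finished iterating the body
  ActiveSupport::Notifications.instrument('streaming_finished.my_app',
                                          duration: Time.now - started_at)
end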
I am pretty new to SSE so feel free to let me know if I've misunderstood the purpose and there's a much better way of implementing what I want!
I have a working SSE that, every minute, updates a user's dashboard. The code looks like this:
# SitesController
def dashboard
end

def regular_update
  response.headers['Content-Type'] = 'text/event-stream'
  sse = SSE.new(response.stream, event: 'notice')
  begin
    sse.write(NoticeTask.perform) # custom code returning the JSON
    sleep 60
  rescue ClientDisconnected
  ensure
    sse.close
  end
end
# routes
get "/dashboard(/:id)" => "sites#dashboard"
get "/site_update" => 'sites#regular_update'
# view - /dashboard
var source = new EventSource('/site_update');
source.addEventListener('notice', function(event) {
  var data = JSON.parse(event.data);
  appendNoticeAndAlert(data);
});
This works just fine. When I'm on /dashboard for a user, the right info is being updated regularly by the SSE, great!
However, I notice if I'm on any random page, like just the home page, the SSE is still running in the background. Now... obviously this makes sense, since there's nothing in the code that is otherwise limiting that... but shouldn't there be??? Like shouldn't there be a way to scope the SSE in some way? Isn't it a huge waste of resources if the user is never on the /dashboard for the SSE to be constantly working in the background, updating the /dashboard page?
Again, new to SSE, if this is fundamentally wrong, please advise as well. Thanks!
In your controller, when handling SSE, you're expected to send updates in a loop; ActionController::Live::ClientDisconnected is then raised by response.stream.write once the client is gone:
def regular_update
  response.headers['Content-Type'] = 'text/event-stream'
  sse = SSE.new(response.stream, event: 'notice')
  loop do
    sse.write(NoticeTask.perform) # custom code returning the JSON
    sleep 60
  end
rescue ClientDisconnected
  logger.info "Client is gone"
ensure
  sse.close
end
Your code disconnects the client after the first update and delay, but everything appears to work because EventSource automatically reconnects (so you're effectively getting long-polling updates).
On the client, the EventSource should be close()d once it is no longer needed. Usually that happens automatically upon navigation away from the page containing it, so:
make sure the EventSource JavaScript is only on the dashboard page, not in the JavaScript bundle (or is in the bundle, but only enabled on that specific page)
if you're using Turbolinks, you have to close() the connection manually; as a quick solution, try adding <meta name="turbolinks-visit-control" content="reload"> to the page header or disabling Turbolinks temporarily.
Also think again about whether you actually need SSE for this specific task: for plain periodic updates you can simply poll a JSON action from client-side code that renders the same data. This makes the controller simpler, does not keep a connection busy for each client, has wider server compatibility, etc.
For SSE to be justified, at least check whether something has actually changed and skip the message if nothing has. A better way is to use some kind of pub-sub (like Redis' SUBSCRIBE/PUBLISH, or Postgres' LISTEN/NOTIFY): emit events to a topic every time something that affects the dashboard changes, subscribe on SSE connect, and so on (maybe also throttle updates, depending on your application); see the sketch below. Something similar can be implemented with Action Cable (a bit overkill, but handy, since it already has pub-sub integrated).
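A minimal sketch of the Redis pub-sub variant, assuming the redis gem and a made-up channel name 'dashboard_updates' (elsewhere in the app you would publish with Redis.new.publish('dashboard_updates', payload.to_json)):

def regular_update
  response.headers['Content-Type'] = 'text/event-stream'
  sse = SSE.new(response.stream, event: 'notice')
  redis = Redis.new
  # subscribe blocks; each published message is pushed to the client as it arrives
  redis.subscribe('dashboard_updates') do |on|
    on.message do |_channel, message|
      sse.write(message) # raises ClientDisconnected once the client is gone
    end
  end
rescue ClientDisconnected
ensure
  redis.quit if redis
  sse.close
end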
I am trying to fetch data from the Twitter Streaming API in my Rails app, and I have created my own module which gives me the Twitter Authorization header. I am able to get authorized, but I am not getting the response back; all I see is the request in a pending state (I am guessing because it's streaming and the connection is never closed). What can I do to the code below so that I can start printing the response as I get it from the Streaming API?
class MainController < ApplicationController
  include Oauth::Keys

  def show
    @oauth_signature_string = Oauth::Signature.generate(signature_params)
    @header = Oauth::HeaderString.create(header_params)
    RestClient::Request.execute(method: :get,
      url: 'https://stream.twitter.com/1.1/statuses/sample.json',
      headers: { Authorization: %Q(OAuth ****************************) })
  end
end
Because Twitter is "streaming" the data and not closing the connection, your RestClient request never finishes, so your show action hangs there. As a result, Rails can't continue and render the default main/show.html.erb page.
So, you might want to look into the ActionController::Streaming class and see if you can rewrite your views and HTTP call to utilize it (a sketch follows). Or, it would be much easier to use a non-streaming API endpoint.
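A minimal sketch of that idea, using ActionController::Live (the Rails 4+ module for streaming arbitrary data; ActionController::Streaming mentioned above is for streaming templates) and reusing the @header built in the question:

class MainController < ApplicationController
  include ActionController::Live
  include Oauth::Keys

  def show
    @header = Oauth::HeaderString.create(header_params)
    response.headers['Content-Type'] = 'application/json'
    uri = URI.parse('https://stream.twitter.com/1.1/statuses/sample.json')
    Net::HTTP.start(uri.host, uri.port, use_ssl: true) do |http|
      request = Net::HTTP::Get.new(uri.request_uri, 'Authorization' => @header)
      http.request(request) do |api_response|
        # forward each chunk to the browser as Twitter sends it
        api_response.read_body do |chunk|
          response.stream.write(chunk)
        end
      end
    end
  ensure
    response.stream.close
  end
end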
Also, what you are doing seems like a better fit for JavaScript. You might want to use Twitter's official JavaScript API to handle authentication and status streams.
API requests take too long and are costing me money, both in my Rails integration tests and in my application.
I would like to save API responses and then use that data for testing. Are there any good ways to make that happen?
Also, how can I make fewer API calls in production/development? What kind of caching can I use?
If I understand correctly, your Rails app is using an external API, like a Google/Facebook/Twitter API, that kind of thing.
Caching the views won't work, because view caching only saves re-rendering the template; it validates that the cache is fresh by hashing the data, so your code will still hit the API to fetch the data behind those hashes.
For you, the best way is to use a class that does all the API calls and caches them in the Rails cache with an expiry period. You don't want your cache to be too stale, but at the same time you'll trade some accuracy for some money (e.g. only make a single call every 5, 15, or 30 minutes, whichever you pick).
Here's a sample of what I have in mind, but you should modify it to match your needs
module ApiWrapper
  class << self
    def some_method(some_key) # if keys are needed, like an id or something
      Rails.cache.fetch("some_method/#{some_key}", expires_in: 5.minutes) do
        # assuming ApiLibrary is the external library handler
        ApiLibrary.call_external_library(some_key)
      end
    end
  end
end
Then in your code, call that wrapper; it will only contact the external API if the stored value in the cache has expired.
The call will be something like this
# assuming 5 is the id or value you want to fetch from the API
ApiWrapper.some_method(5)
You can read more about caching methods in the Rails Guides entry on caching.
Update:
I just thought of another way: for your testing (RSpec tests, for example) you could stub the API calls, and this way you'll skip the whole API call (unless you are testing the API itself). Using the same API library I wrote above, we can stub ApiLibrary itself:
allow(ApiLibrary).to receive(:call_external_library).and_return({ data: 'some fake data' })
PS: the hash key data is part of the return value; the stub returns the whole hash, not just the string.
There is a great gem for this called VCR. It allows you to make a single request and keep the response cached, so every time you run the test you will use that saved response. A minimal setup might look like the sketch below.
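This is a sketch, assuming the webmock gem is also available; the cassette name and directory are made up for illustration:

require 'vcr'

VCR.configure do |config|
  config.cassette_library_dir = 'spec/cassettes'
  config.hook_into :webmock # intercept HTTP requests via WebMock
end

# The first run records the real HTTP response into the cassette file;
# subsequent runs replay it without hitting the API.
VCR.use_cassette('external_api_call') do
  ApiWrapper.some_method(5)
end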
I would use http://redis.io/ in conjunction with something like jbuilder. So as an example your view would look like:
json.cache! ["cache", "plans_index"] do
  json.array! @plans do |plan|
    json.partial! plan
  end
end
for this controller:
def index
  @plans = Plan.all
end
If you have something that is a show page you can cache it like this:
json.cache! ["cache", "plan_#{params["id"]}"] do
  json.extract! @plan, :short_description, :long_description
end
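For completeness, a matching controller action might look like this (a sketch; the Plan model is carried over from the index example above):

def show
  @plan = Plan.find(params[:id])
end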
I have the following code:
def get_request(resource)
  request = Typhoeus::Request.new("#{@BASE_URL}#{resource}",
    userpwd: "#{@USER}:#{@PWD}",
    headers: { 'Content-Type' => "application/x-www-form-urlencoded" })
  response = request.run.body
  puts response
end
Instead of puts response, I want to log the entire response. What's the best/most efficient way to do that? Regardless of what response is, it should be logged. I feel like opening a file, writing to it and closing it every time this method is used would be pretty inefficient. Is there a better way?
If you are using Rails, as the tag suggests, you can use
Rails.logger
to write to the default Rails log. Here's an example (note that in your method, response already holds the body string, so there's no need to call .body on it):
Rails.logger.info response
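If you'd rather keep API responses out of the main log, here is a sketch of a dedicated, memoized logger (the file name is an assumption). Ruby's Logger keeps the file handle open, so there is no per-call open/write/close overhead:

require 'logger'

def api_logger
  # memoized so the log file is opened once, not on every request
  @api_logger ||= Logger.new(Rails.root.join('log', 'api_responses.log'))
end

def get_request(resource)
  request = Typhoeus::Request.new("#{@BASE_URL}#{resource}",
    userpwd: "#{@USER}:#{@PWD}",
    headers: { 'Content-Type' => "application/x-www-form-urlencoded" })
  response = request.run.body
  api_logger.info(response)
  response
end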
I'm working on a Ruby on Rails app that relies on my app making some simple URL calls for user metrics. For part of the tracking I need to make a server-side call prior to the rendering of my index page. This is achieved by calling a specially formatted URL. Currently I'm achieving this in the following way:
url = URI.parse('https://example.tracking.url')
result = Net::HTTP.start(url.host, use_ssl: true, verify_mode: OpenSSL::SSL::VERIFY_NONE) do |http|
  http.get url.request_uri, 'User-Agent' => 'MyLib v1.2'
end
The loading of my page seems to be, at times, somewhat delayed. Short of it being a database latency issue, I assume it's just that sometimes the URL takes extra time to respond and that this is a synchronous request. What is the best way to make asynchronous requests in Rails? Threads, maybe? Thanks.
Have you looked into using a delayed job or Thread.new?
I would move it into a helper method and then call that method inside Thread.new, as in the sketch below. Personally, I like using delayed_job for handling things that may present a delay to the user interface.
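A minimal fire-and-forget sketch of the Thread.new approach; the helper name track_metrics_async is made up, and the URL is the placeholder from the question:

def track_metrics_async
  Thread.new do
    begin
      url = URI.parse('https://example.tracking.url')
      Net::HTTP.start(url.host, use_ssl: true) do |http|
        http.get url.request_uri, 'User-Agent' => 'MyLib v1.2'
      end
    rescue StandardError => e
      # tracking is best-effort; log and move on rather than crash the thread
      Rails.logger.warn "Tracking call failed: #{e.message}"
    end
  end
end

Calling track_metrics_async before rendering the index page returns immediately, so the tracking call no longer delays the response.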