I'm new to Rails 4, and I want to delay an HTTP response.
I thought the 'respond_to' method is what enables a controller to respond to an HTTP request. However, when I removed the 'respond_to' method, the Rails controller still responded to the request automatically.
Below are my steps.
Send HTTP Request in view
[index.html.erb]
<script>
  var ready = function() {
    alert('this is function');
  }
  var aquery = function() {
    $.ajax({
      type : "POST",
      url : "app/subaction",
    });
  }
  $(document).on('ready page:load', aquery);
</script>
Receive HTTP Request in Controller
class AppController < ApplicationController
  def subaction
    # (Nothing here...)
  end
end
subaction.js.erb
$('#div_id').empty().append("Complete response ...");
With these steps, the response is rendered automatically even though there is no "respond_to" method.
Can I delay the response? And can you explain the request-response cycle in Rails?
Thank you...
The main reason Rails renders a response by default is that this is its documented, intended behaviour. Rails follows a philosophy of 'convention over configuration', which means it tries to help you out in ways that keep the code you write to a minimum. Most of the time you want a controller action to render the view named after it, so Rails does that for you - in your case, it finds and renders subaction.js.erb without being asked.
You can use a number of techniques to delay responses. The simplest is to use Ruby's sleep method to introduce a delay:
class AppController < ApplicationController
  def subaction
    sleep 3 # Wait for 3 seconds before responding
  end
end
This might be useful when testing how your app behaves over a slow internet connection, but should probably be avoided in production code. Fast apps make happy users.
You could also use the ActionController::Live module, introduced in Rails 4.0. It allows you to stream data to the client, but consuming the stream can be tricky: jQuery waits for the response to complete before firing its callbacks, so you'll have to use something else to process the stream.
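For example, the browser's built-in EventSource API consumes a streamed response incrementally. This is a minimal sketch, assuming the action streams in the text/event-stream format; the /subaction_stream URL is a hypothetical placeholder:

// Fires once per message as it arrives, not when the response completes
var source = new EventSource('/subaction_stream');
source.onmessage = function(event) {
  console.log('Received:', event.data);
};
// Close on error to stop the automatic reconnection attempts
source.onerror = function() {
  source.close();
};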
This is similar to WebSockets, an emerging streaming standard. There's some support available for WebSockets in Rails, but it's not universally supported by browsers.
Another alternative is to switch the delay to the frontend. You can use JavaScript's setTimeout or setInterval to call some code after a delay:
setTimeout(function() {
  alert("I run once, after 4 seconds");
}, 4000);

setInterval(function() {
  alert("I run every two seconds");
}, 2000);
If you're trying to check for updates, you might be tempted to use setInterval, but you may find it more flexible to use setTimeout to schedule a one-off check of the server. You can then include a time delay from the server which specifies how long to wait before asking again.
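For example (a sketch only: the retry_after field is a hypothetical value your server could include in its JSON response; the URL matches the question's endpoint):

var poll = function() {
  $.ajax({
    type: "POST",
    url: "app/subaction",
    dataType: "json"
  }).always(function(response) {
    // Schedule the next one-off check; wait as long as the server asked,
    // falling back to 5 seconds if it didn't say
    var delay = (response && response.retry_after) || 5000;
    setTimeout(poll, delay);
  });
};
poll();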
Related
I am working on offline support in my PWA. I am using Workbox for that. This is my current code:
const addToFormPlugin = new workbox.backgroundSync.Plugin('addToForm');
workbox.routing.registerRoute(
  RegExp('MY_PATH'),
  workbox.strategies.networkOnly({
    plugins: [addToFormPlugin]
  }),
  'POST'
);
The code seems to work fine on my computer. However, once I run the app on a phone, it takes ages to upload the requests stored in IndexedDB. I know that this happens on the sync event, but it seems to take at least 5 minutes. This is not exactly what I need. I wonder if there is an option to access IndexedDB and send all the requests "manually" on a click. Another way would be to check whether the device is online. Attached is a screenshot of how the requests are stored.
If you need to force this, the cleanest approach would be to use the workbox.backgroundSync.Queue class (instead of workbox.backgroundSync.Plugin) directly.
The Plugin class takes care of setting up a fetchDidFail callback for you, so if you use the Queue class, you need to do that yourself:
const queue = new workbox.backgroundSync.Queue('addToForm');
workbox.routing.registerRoute(
  RegExp('MY_PATH'),
  workbox.strategies.networkOnly({
    plugins: [{
      fetchDidFail: async ({request}) => {
        await queue.addRequest(request);
      },
    }],
  }),
  'POST'
);
You could then call queue.replayRequests() to trigger the replay, e.g., as a result of a message event:
self.addEventListener('message', (event) => {
  if (event.data === 'replayRequests') {
    queue.replayRequests();
  }
});
But... that all being said, I think your best bet is just to let the browser "do its thing" and figure out when the right time is to replay the queued requests. That will end up being more battery-friendly for mobile devices.
If you're unhappy with the interval that the browser waits before firing a sync event, then the best course of action could be to open a bug against the browser—whether it's Chrome (as appears in your screenshot) or another browser.
I have a React component that mimics the "link preview" feature that most modern social media sites have. You type in a link and it fetches the image, title, etc...
I do this by having the React component make an AJAX call back to my server to fetch the URL preview data.
While it's fetching I show an intermediate "loading" state (i.e. some loading icon or spinning wheel)
The relevant React snippet looks like
this.setState({ isLoadingAttachment: true })
return $.ajax({
  type: "GET",
  url: some_url,
  dataType: "json",
  contentType: "application/json",
}).success(function(response){
  // Successful! Do success stuff
  component.setState({ isLoadingAttachment: false })
}).error(function(response) {
  // Uh oh! Handle failure stuff
  component.setState({ isLoadingAttachment: false })
});
Note how the isLoadingAttachment state variable is only true for the brief moment while the server is doing the fetching. Both the success and error handlers immediately clear it.
I'd like to test some functionality during my "loading" state with my Capybara feature specs. I've mocked all the web calls and the data to be returned by the server, but it all happens so quickly that it passes through the "loading" state before I can even run any expect().. statement on it. I also purposely don't call wait_for_ajax so the page will go ahead without waiting for the ajax, but it's still too fast.
Lastly I also tried purposefully delaying the server call by 1.0 second, but that didn't work either. I assume because the whole thing is single threaded somehow?
# `foo` is an arbitrary method called during the server-side execution
allow_any_instance_of(MyController).
  to receive(:foo) { sleep(1.0) }.and_call_original
Any thoughts on how I could do this?
Thanks!
Capybara starts the app server in a different thread than the tests. However, if you're using the default Capybara.server setting, you may have issues with your app calling back to itself, because that default is WEBrick. Instead, you should specify Capybara.server = :puma.
Beyond that, mocking responses is generally a bad idea in feature specs (which are meant to be end-to-end tests), since it means you're no longer testing your app's code the way it would run in production. A better solution is to use something like puffing-billy - https://github.com/oesmith/puffing-billy - to mock web responses outside of your app's code, which would allow you to do something like
proxy.stub('https://example.com/proc/').and_return(Proc.new { |params, headers, body|
  sleep 2
  { :text => "Your results" }
})
How do I send update notifications to clients using server-sent events?
What I want to accomplish: when a client makes an Ajax call to an action, the server then sends the relevant data to all connected clients through my stream action.
I'd like to know whether this is possible without WebSockets or pub/sub.
From what I can gather, you're looking for a generalized approach, rather than specific code?
--
SSEs
Server-Sent Events are an HTML5 technology, meaning that if you use them correctly, it shouldn't matter whether you use Rails or another framework -- they should just work.
One drawback to SSEs is that they behave much like Ajax long-polling: the browser holds a connection open to your server (reconnecting automatically whenever it drops) and relays back any updates it receives. You'll also still typically pair them with a pub/sub pattern on the server.
-
Simply put, an SSE setup has a JavaScript "event listener" in the browser which listens to an "endpoint" (a URL). In Rails, the endpoint will be a controller#action, from which you can send the relevant text/event-stream updates; that is what ActionController::Live::SSE is there to do.
--
Setup
#config/routes.rb
resources :your_controller do
  collection do
    get :endpoint
  end
end
#app/assets/javascripts/application.js
var source = new EventSource('your_controller/endpoint');
source.addEventListener('message', function(e) {
  console.log(e.data);
}, false);
#app/controllers/your_controller.rb
class YourController < ActionController::Base
  include ActionController::Live

  def endpoint
    response.headers['Content-Type'] = 'text/event-stream'
    sse = SSE.new(response.stream, retry: 300, event: "event-name")
    sse.write({ name: 'John' })
  ensure
    sse.close
  end
end
This will send the relevant update to every client holding an open EventSource connection to the endpoint.
I am building a Rails/JavaScript application which is supposed to support real-time notifications. The JavaScript is:
var chat;
$(document).ready(function(){
  chat = $('#chat');
  chat.append('mmm');
  (function poll(){
    $.ajax({ url: "http://localhost:3000/get_messages", success: function(data){
      // Update your dashboard gauge
      chat.append(data);
    }, dataType: "json", complete: poll, timeout: 30000 });
  })();
});
The route:
match 'get_messages', to: 'real_time_notifs#get_messages', :via => :get
Here is the controller's method:
def get_messages
  # sleep ??? will it stop the whole application?
  render :json => ['message body']
end
I want the JavaScript to receive a response only when there is something to display (for example, a new message has appeared in a database table), without bringing the whole application to a stop. Could you suggest how to organize the get_messages method?
I need a solution which will not block the rest of the application while waiting.
There are a number of ways to achieve this
Although I don't have huge experience, you should be thinking about it from another perspective (not just sending Ajax poll requests):
SSEs (Server-Sent Events)
I'd recommend you use SSEs.
The updates are sent with their own MIME type (text/event-stream) over a long-lived HTTP response, which I believe means they arrive completely asynchronously (it doesn't matter what else your app is doing).
SSEs are basically set up through the front-end by deploying a JS listener. Unlike plain Ajax polling, it keeps a single connection open and listens for updates in the text/event-stream MIME type:
var source = new EventSource("demo_sse.php");
source.onmessage = function(event) {
  alert(event.data);
};
The efficient part is that you can then produce these updates with ActionController::Live::SSE in Rails. I don't have any experience with this, but it basically allows you to send updates via the text/event-stream MIME type.
WebSockets
WebSockets basically open a perpetual connection with your server (after an initial HTTP handshake), allowing you to receive content outside the normal request/response cycle.
My experience does not extend to "native" WebSockets (we've successfully used Pusher, and are working on our own WebSocket implementation); but I can say that it's basically a more in-depth version of SSEs.
You'll have to use JS to authenticate the client-server connection, and once connected, the browser will listen for updates. (WebSockets speak their own ws:// protocol rather than using an HTTP MIME type.) Reading up on ActionController::Live will give you some insight into how this kind of streaming works.
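As an illustration only (the URL and token parameter are hypothetical placeholders, not any specific Rails API):

// Open a perpetual connection; it stays alive after the initial handshake
var socket = new WebSocket('wss://example.com/notifications?token=AUTH_TOKEN');

socket.onopen = function() {
  console.log('Connected; waiting for updates');
};

// Fires each time the server pushes a frame down the open connection
socket.onmessage = function(event) {
  console.log('Update received:', event.data);
};

socket.onclose = function() {
  console.log('Connection closed');
};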
Either one of these methods will do as you need (only send / receive updates as they are available)
Every 10 seconds, my page calls this Ajax endpoint to reload the number of received mails.
However, my MacBook heats up noticeably as time goes by, even when I'm doing nothing but staying on the same page of my application.
How should I handle this kind of recurring request?
refresh_mail_count.js
jQuery(document).ready(function () {
  refreshMail();
});

function refreshMail() {
  $.ajax({
    url: "/messages/refresh_mail",
    type: "GET",
    dataType: "script",
  });
}
refresh_mail.js.erb
$('#message_count').html("<%= j(render(:partial => 'layouts/message_received_count', :object => @message_count)) %>");
setTimeout(refreshMail, 10000);
The CPU gets hot when it does work; the question, then, is whether this work is justified. That is, which process(es) use the CPU, and when?
The work will not come from the network request itself, as that is an I/O operation with low CPU usage. Consider instead what might cause the load:
Processing the response data (excessive/slow DOM manipulation), or
the web server itself (a slow/inefficient implementation), if running locally, or
the AJAX requests piling up - this can lead to a snowball effect¹!
¹ Make sure to only post a new request 10 seconds after receiving the previous success/failure callback.
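A sketch of that pattern, reusing the question's endpoint: the next request is scheduled only in the complete callback, which fires after both success and failure, so at most one request and one timer are ever pending:

function refreshMail() {
  $.ajax({
    url: "/messages/refresh_mail",
    type: "GET",
    dataType: "script",
    // complete runs after success *and* failure, so polling
    // continues at a steady pace and requests can't snowball
    complete: function() {
      setTimeout(refreshMail, 10000);
    }
  });
}

jQuery(document).ready(refreshMail);

If you schedule the next call here, drop the setTimeout from refresh_mail.js.erb so that only one timer is ever pending.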