Does frequent ajax call kill PC? How can I handle this? - ruby-on-rails

Every 10 seconds, my page makes this AJAX call to reload the number of received mails.
However, my MacBook seems to get hotter and hotter over time, even when I'm doing nothing but sitting on the same page of my application.
How should I handle this kind of polling?
refresh_mail_count.js
jQuery(document).ready(function () {
  refreshMail();
});

function refreshMail() {
  $.ajax({
    url: "/messages/refresh_mail",
    type: "GET",
    dataType: "script"
  });
}
refresh_mail.js.erb
$('#message_count').html("<%= j(render(:partial => 'layouts/message_received_count', :object => @message_count)) %>");
setTimeout(refreshMail, 10000);

The CPU gets hot when it does work; the question is whether that work is justified. That is, which process(es) are using the CPU, and when?
The work will not come from the network request itself, since that is an I/O operation with low CPU usage, but consider what might be doing the work:
The processing of the response data (excessive or slow DOM manipulation), or
The web server itself (a slow or inefficient implementation), if it is running locally, or
The AJAX requests piling up, which can lead to a snowball effect.
To avoid the snowball effect, make sure to only issue a new request 10 seconds after receiving the previous success/failure callback, as sketched below.
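For illustration, here is a minimal sketch of that pattern, reusing the /messages/refresh_mail endpoint from the question; with this structure the setTimeout in refresh_mail.js.erb would no longer be needed:
// Schedule the next poll only after the current request has finished,
// so requests can never pile up, even if the server responds slowly.
function refreshMail() {
  $.ajax({
    url: "/messages/refresh_mail",
    type: "GET",
    dataType: "script"
  }).always(function () {
    // Wait the full 10 seconds after the response (success or failure)
    // before asking again.
    setTimeout(refreshMail, 10000);
  });
}

jQuery(document).ready(refreshMail);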

Related

Long time waiting Request to Service Worker

I have noticed that the time spent waiting for a service worker to respond with items from the cache is not as fast as you would expect. I have seen the same wait times with both sw-precache and a custom-written service worker.
What are the possible causes of this wait time, and how could I reduce it?
My fetch event on the custom service worker looks like:
self.addEventListener('fetch', function(event) {
  event.respondWith(
    caches.match(event.request).then(function(response) {
      if (response) {
        return response;
      }
      return fetch(event.request);
    })
  );
});
Do you have 'Update on reload' checked within your Chrome Dev Tools under the Application -> Service Worker tab?
If so, this may be the problem, as it re-runs all of your service worker code on each reload, which can be quite a lot of code when using sw-precache.
Even though I can't explain the cause of this strange wait time, I do know how to reduce it.
We are able to intercept a fetch event in the service worker with event.respondWith(). In my case, when my page needed to load a vendor's JavaScript via a script tag, my service worker defaulted to intercepting every fetch event and performing either cache-then-network (for assets) or network-only (for data fetching), like this:
if (shouldLoadOfflinePage) {
  fetchEvent.respondWith(cacheManager.fromCache(new Request(offlinePage)));
} else if (shouldFromCache) {
  fetchEvent.respondWith(cacheManager.fromCache(fetchEvent.request));
} else {
  fetchEvent.respondWith(fetch(fetchEvent.request));
}
The last branch intercepts network-only requests, which is unnecessary. That unnecessary interception somehow caused a blocking load (though I don't know exactly what was blocking): in DevTools, the "Request to ServiceWorker" wait time was roughly 400 ms.
So I stopped intercepting those requests, simply by removing the last branch:
if (shouldLoadOfflinePage) {
  fetchEvent.respondWith(cacheManager.fromCache(new Request(offlinePage)));
} else if (shouldFromCache) {
  fetchEvent.respondWith(cacheManager.fromCache(fetchEvent.request));
}
Then my page needed only about 16 ms to load the aforementioned file.
Hope this helps.
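As a rough generalization of the same idea (the isCacheable predicate below is hypothetical, purely for illustration): a fetch handler can simply return without calling event.respondWith() for requests it does not want to handle, and the browser then performs a normal network request without waiting on the service worker.
self.addEventListener('fetch', function (event) {
  // Hypothetical predicate: only same-origin GET requests are served from the cache.
  var isCacheable = event.request.method === 'GET' &&
                    new URL(event.request.url).origin === self.location.origin;

  if (!isCacheable) {
    // No respondWith() call: the browser handles the request normally.
    return;
  }

  event.respondWith(
    caches.match(event.request).then(function (cached) {
      return cached || fetch(event.request);
    })
  );
});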
Clearing out IndexedDB will also help to reduce the "Request to Service Worker" time. You can delete it from JavaScript:
indexedDB.deleteDatabase(location.host);
or do it manually by removing the folder:
/Users/[USERNAME]/Library/Application Support/Google/Chrome/Default/IndexedDB/

How to purposely delay an AJAX response while testing with Capybara?

I have a React component that mimics the "link preview" feature that most modern social media sites have. You type in a link and it fetches the image, title, etc...
I do this by having the React component make an AJAX call back to my server to fetch the URL preview data.
While it's fetching, I show an intermediate "loading" state (i.e. a loading icon or spinner).
The relevant React snippet looks like:
this.setState({ isLoadingAttachment: true })
return $.ajax({
  type: "GET",
  url: some_url,
  dataType: "json",
  contentType: "application/json",
}).success(function(response){
  // Successful! Do success stuff
  component.setState({ isLoadingAttachment: false })
}).error(function(response) {
  // Uh oh! Handle failure stuff
  component.setState({ isLoadingAttachment: false })
});
Note how the isLoadingAttachment state variable is only true for a brief moment while the server is doing the fetching. Both the success and error handlers immediately turn it off.
I'd like to test some functionality of the "loading" state in my Capybara feature specs. I've mocked all the web calls and the data returned by the server, but it all happens so quickly that the page passes through the "loading" state before I can even run an expect() statement against it. I also purposely don't call wait_for_ajax, so the page proceeds without waiting for the AJAX call, but it's still too fast.
Lastly, I also tried purposely delaying the server call by 1.0 second, but that didn't work either. I assume that's because the whole thing is somehow single-threaded?
# `foo` is an arbitrary method called during the server-side execution
allow_any_instance_of(MyController).
to receive(:foo) { sleep(1.0) }.and_call_original
Any thoughts on how I could do this?
Thanks!
Capybara starts the app server in a different thread from the tests. However, if you're using the default Capybara.server setting, you may have issues with your app calling back to itself, since it uses WEBrick by default. Instead you should specify Capybara.server = :puma.
Beyond that, mocking responses is generally a bad idea in feature specs (which are meant to be end-to-end tests), since it means you're no longer testing your app's code the way it runs in production. A better solution is to use something like puffing-billy - https://github.com/oesmith/puffing-billy - to mock web responses outside of your app's code, which would allow you to do something like:
proxy.stub('https://example.com/proc/').and_return(Proc.new { |params, headers, body|
  sleep 2
  { :text => "Your results" }
})

Can I delay HTTP response in rails?

I'm new to Rails 4. I want to delay an HTTP response.
I thought the respond_to method was what made the controller respond to an HTTP request.
However, when I removed the respond_to method, the Rails controller still responded to the request automatically.
Below are my steps for this.
Send HTTP Request in view
[index.html.erb]
<script>
  var ready = function() {
    alert('this is function');
  }

  var aquery = function() {
    $.ajax({
      type : "POST",
      url : "app/subaction",
    });
  }

  $(document).on('ready page:load', aquery);
</script>
Receive HTTP Request in Controller
class AppController < ApplicationController
  def subaction
    # (Nothing here...)
  end
end
subaction.js.erb
$('#div_id').empty().append("Complete response ...");
In this step, the response was rendered automatically even though there is no respond_to call.
Can I delay the response? Can you explain the request-response cycle in Rails?
Thank you.
The main reason Rails renders a response by default is that it follows a philosophy of "convention over configuration", which means it tries to help you out in ways that keep the code you write to a minimum. Most of the time you want your controller actions to render a view, so Rails does that for you.
You can use a number of techniques to delay responses. The simplest is to use Ruby's sleep method to introduce a delay:
class AppController < ApplicationController
  def subaction
    sleep 3 # Wait for 3 seconds before responding
  end
end
This might be useful when testing how your app behaves over a slow internet connection, but should probably be avoided in production code. Fast apps make happy users.
You could also use the ActionController::Live module, introduced in Rails 4.0. It allows you to stream data to the client, but consuming the stream can be tricky: jQuery waits for the response to complete before firing its callbacks, so you'll have to use something else to process the stream.
This is similar to WebSockets, an emerging streaming standard. There's some support available for WebSockets in Rails, but it's not universally supported by browsers.
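As a rough illustration of processing a stream without jQuery, a browser can read a streamed response incrementally with the Fetch API; this is only a sketch, with the /app/subaction path borrowed from the question above.
// Read a streamed response chunk by chunk; jQuery's $.ajax would instead
// wait for the entire response before firing its callbacks.
fetch("/app/subaction").then(function (response) {
  var reader = response.body.getReader();
  var decoder = new TextDecoder();

  function readChunk() {
    return reader.read().then(function (result) {
      if (result.done) return;           // stream finished
      var text = decoder.decode(result.value, { stream: true });
      console.log("received chunk:", text);
      return readChunk();                // keep reading
    });
  }

  return readChunk();
});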
Another alternative is to switch the delay to the frontend. You can use JavaScript's setTimeout or setInterval to call some code after a delay:
setTimeout(function() {
  alert("I run once, after 4 seconds");
}, 4000);

setInterval(function() {
  alert("I run every two seconds");
}, 2000);
If you're trying to check for updates, you might be tempted to use setInterval, but you may find it more flexible to use setTimeout to schedule a one-off check of the server. You can then include a time delay in the server's response which specifies how long to wait before asking again, as in the sketch below.
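A minimal sketch of that last idea (the /updates URL and the retry_in field are hypothetical, purely for illustration):
function checkForUpdates() {
  $.getJSON("/updates", function (data) {
    // ... update the page with data ...

    // The server includes a hypothetical retry_in value (in seconds)
    // telling the client how long to wait before the next check.
    var delay = (data.retry_in || 10) * 1000;
    setTimeout(checkForUpdates, delay);
  });
}

checkForUpdates();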

Rails: sleep until there is data to respond with (streaming + multithreading)

I am building a Rails/JavaScript application which is supposed to support real-time notifications. The JavaScript is:
var chat;

$(document).ready(function(){
  chat = $('#chat');
  chat.append('mmm');

  (function poll(){
    $.ajax({
      url: "http://localhost:3000/get_messages",
      success: function(data){
        // Update your dashboard gauge
        chat.append(data);
      },
      dataType: "json",
      complete: poll,
      timeout: 30000
    });
  })();
});
The route:
match 'get_messages', to: 'real_time_notifs#get_messages', :via => :get
Here is the controller's method:
def get_messages
  # sleep ??? will it stop the whole application?
  render :json => ['message body']
end
I want the JavaScript to receive a response only when there is something to display (for example, a new message appears in a database table), without making the whole application stop. Could you suggest how to organize the get_messages method?
I need a solution that does not block the rest of the application while waiting.
There are a number of ways to achieve this.
Although I don't have huge experience here, you should be thinking about it from another perspective (not just sending Ajax poll requests):
SSEs (Server-Sent Events)
I'd recommend you use SSEs.
The updates are sent outside the usual HTTP request/response cycle (they use their own MIME type, text/event-stream), which I believe means they are completely asynchronous (it doesn't matter what else you're doing in the app).
SSEs are handled on the front end by deploying a JS listener. This keeps a connection open to the server for updates, but unlike Ajax, it only listens for the text/event-stream MIME type:
var source = new EventSource("demo_sse.php");

source.onmessage = function(event) {
  alert(event.data);
};
The efficient part is that you can then produce these updates with ActionController::Live::SSE in Rails. I don't have any experience with this, but it basically allows you to send updates with the text/event-stream MIME type.
WebSockets
WebSockets basically open a perpetual connection with your server, allowing you to receive content outside the normal HTTP request/response cycle.
My experience does not extend to "native" WebSockets (we've successfully used Pusher, and are working on our own WebSocket implementation), but I can say that it's basically a more in-depth version of SSEs.
You'll have to use JS to authenticate the client-server connection, and once connected, the browser will listen for updates. I'm not sure about the MIME type for this, but reading up on ActionController::Live will give you some insight into how it works. A bare-bones client is sketched below.
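For illustration, a minimal browser-side WebSocket client might look like this (the ws://localhost:3000/notifications endpoint and the message format are made up for the sketch):
// Open a persistent connection; the server can push messages at any time.
var socket = new WebSocket("ws://localhost:3000/notifications");

socket.onopen = function () {
  // Authenticate or subscribe to a channel here if the server expects it.
  socket.send(JSON.stringify({ subscribe: "chat" }));
};

socket.onmessage = function (event) {
  // Each pushed message arrives here with no polling involved.
  $('#chat').append(JSON.parse(event.data).body);
};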
Either one of these methods will do as you need (only send / receive updates as they are available)

Ajax polling crashing the browser as its memory usage and CPU utilization continuously increase? Any alternative?

I am new to Ajax polling and I implemented it to fetch data continuously. The problem is that memory usage and CPU utilization keep increasing, and eventually the browser crashes.
Here is the Ajax call I am using to fetch data continuously:
$(document).ready(function () {
  make_call();

  function make_call() {
    $.ajax({
      url: "url",
      accepts: "application/json",
      cache: false,
      success: function (result) {
        // Some code here
      },
      complete: make_call
    });
  }
});
Is there any other alternative, or am I doing something wrong? Please provide a suggestion or solution. Thanks in advance.
Your code initiates a new request at the same moment the previous request completes (complete fires on either error or success). You likely want a small delay before requesting new data, which has the added benefit of reducing both server and client load.
$.ajax({
  // ...
  complete: function() {
    setTimeout(make_call, 5000);
  }
});
The above code waits for 5 seconds before making the next request. Tune the value to match your definition of "continuous".
