Ajax polling crashes the browser - memory usage and CPU utilization continuously increase. Any alternative? - asp.net-mvc

I am new to ajax polling and implemented it to fetch data continuously. The problem is that memory usage and CPU utilization keep increasing, and eventually the browser crashes.
Here is the ajax call I am using to fetch data continuously:
$(document).ready(function () {
    make_call();
    function make_call() {
        $.ajax({
            url: "url",
            accepts: "application/json",
            cache: false,
            success: function (result) {
                // Some code here
            },
            complete: make_call
        });
    }
});
Is there any other alternative, or am I doing something wrong? Please provide some suggestions or a solution. Thanks in advance.

Your code initializes a new request at the same moment the previous request completes (complete fires on both error and success). You likely want a small delay before requesting new data, with the added benefit of reducing both server and client load.
$.ajax({
    // ...
    complete: function () {
        setTimeout(make_call, 5000);
    }
});
The above code waits for 5 seconds before making the next request. Tune the value to your needs of "continuous".
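The same idea can be factored into a small helper. This is a minimal sketch (the `pollOnce` callback is a stand-in for the real `$.ajax` call, not part of the original code): the next poll is scheduled only after the previous one reports completion, so requests can never overlap or pile up.

```javascript
// Minimal polling-loop sketch. pollOnce(done) stands in for the real
// $.ajax call and must invoke done() when the request completes
// (success or error). Only then is the next poll scheduled, plus delayMs.
function startPolling(pollOnce, delayMs) {
    function tick() {
        pollOnce(function onDone() {
            setTimeout(tick, delayMs); // schedule the NEXT poll only now
        });
    }
    tick();
}

// Usage sketch: the fake pollOnce stops calling done() after 3 rounds
// just so the demo terminates; real code would loop indefinitely.
var rounds = 0;
startPolling(function (done) {
    rounds++;
    if (rounds < 3) {
        done(); // in real code: call done() from $.ajax's complete handler
    }
}, 5);
```

Because each `setTimeout` is issued from the completion callback, a slow server automatically slows the polling down instead of letting requests stack up.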

Related

Does frequent ajax call kill PC? How can I handle this?

Every 10 seconds, it calls this ajax to reload the number of received mails.
However, my MacBook seems to get hotter over time, even when I'm doing nothing but staying on the same page in my application.
How should I handle this kind of transaction?
refresh_mail_count.js
jQuery(document).ready(function () {
    refreshMail();
});

function refreshMail() {
    $.ajax({
        url: "/messages/refresh_mail",
        type: "GET",
        dataType: "script"
    });
}
refresh_mail.js.erb
$('#message_count').html("<%= j(render(:partial => 'layouts/message_received_count', :object => @message_count)) %>");
setTimeout(refreshMail,10000);
The CPU gets hot when it does work; the question, then, is whether this work is justified. That is, which process(es) use the CPU, and when?
This work will not come from the network request itself, as that is an I/O operation with low CPU usage, but consider what might cause the work:
The processing of the response data (excessive/slow DOM manipulation), or
The web server itself (a slow/inefficient implementation), if running locally, or
The AJAX requests might be piling up - this can lead to a snowball effect¹!
¹ Make sure to only post a new request 10 seconds after receiving the previous success/failure callback.
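The footnote can be made concrete by moving the `setTimeout` into the request's completion callback. This is a sketch with a tiny stand-in for `$.ajax` so it runs outside a browser; in the real page you would use jQuery itself, and the delay would be 10000 ms as in the question.

```javascript
// Sketch: schedule the next refresh only from the completion callback,
// never unconditionally, so requests cannot pile up. The $ object below
// is a minimal stand-in for jQuery so the sketch is self-contained.
var requestCount = 0;
var $ = {
    ajax: function (opts) {
        requestCount++;
        setTimeout(opts.complete, 1); // simulate an async server response
    }
};

var REFRESH_DELAY_MS = 5; // would be 10000 in the real page

function refreshMail() {
    $.ajax({
        url: "/messages/refresh_mail",
        type: "GET",
        dataType: "script",
        complete: function () {
            if (requestCount < 3) { // demo stop condition; real code loops forever
                setTimeout(refreshMail, REFRESH_DELAY_MS);
            }
        }
    });
}

refreshMail();
```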

file_get_contents('YOUTUBE API') returns cached/non-updated/old information

https://gdata.youtube.com/feeds/api/users/EXAMPLE/uploads?v=2&alt=jsonc
When visiting this url direct from a browser, it will return the correct data 100% of the time. If a video has been added, it's there, if a video has been deleted, it's gone.
When getting this data through file_get_contents('https://gdata.youtube.com/feeds/api/users/EXAMPLE/uploads?v=2&alt=jsonc');
The data seems to be cached, or at least not the updated/current data...
If you continue refreshing the page, it will show/hide new videos, as well as show/hide deleted videos for about 5-10 minutes, then it will be accurate.
The same thing happens when I get data using $.getJSON(), or $.ajax()...
Shouldn't the data be the same as when visiting the url in the browser?
I'm simply trying to get the most recent video uploaded by a user "EXAMPLE".
public function ajaxUpdateVideoFeed()
{
    header("Content-type: application/json");
    $json = file_get_contents('https://gdata.youtube.com/feeds/api/users/EXAMPLE/uploads?v=2&alt=jsonc');
    $data = json_decode($json, TRUE);
    $videoId = $data['data']['items'][0]['id'];
    echo json_encode($videoId);
    die();
}
Try appending a random number to the end of your url call: https://gdata.youtube.com/feeds/api/users/EXAMPLE/uploads?v=2&alt=jsonc&r=RAND, where RAND is some arbitrary randomly generated number each time the url is called. Not sure if this will work for you, but it may be worth a try.
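One common way to build such a cache-busting URL on the client is to append the current timestamp rather than a random number, which avoids collisions and changes on every call. A small sketch (the helper name is my own, not from the question):

```javascript
// Cache-busting sketch: append a value that changes on every call so
// intermediate caches treat each request as a distinct URL.
function cacheBust(url) {
    var sep = url.indexOf('?') === -1 ? '?' : '&';
    return url + sep + 'r=' + Date.now();
}

var url = cacheBust('https://gdata.youtube.com/feeds/api/users/EXAMPLE/uploads?v=2&alt=jsonc');
// url now ends with ...&alt=jsonc&r=<current timestamp>
```

Note this only defeats caching between the browser and the server; if the caching happens server-side (e.g. inside the API itself), a client-side cache-buster will not help.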
I'm having a similar issue. I'm trying to retrieve current state of a specific video via $.ajax() call, and the response data appears to stay cached. If I try the url from a browser the data is updated.
$.ajax({
    url: 'https://gdata.youtube.com/feeds/api/videos/YouTubeID?v=2&alt=json',
    type: "post",
    dataType: 'json',
    cache: false,
    success: function (response, textStatus, jqXHR) {
        var ytState = response.entry.app$control.yt$state.name;
    },
    error: function (jqXHR, textStatus, errorThrown) {
        console.log("The following error occurred: " + textStatus + errorThrown);
    },
    complete: function () {
    }
});
I have tried json, jsonp, cache: false, appending a time stamp, and appending my developer key to the url, with no success.
EDIT:
Using POST instead of GET in my ajax request seems to eliminate the caching issue.
EDIT 2:
Never mind, I suck. Using POST is producing a 400 error.

jQueryUI autocomplete won't allow me to continue typing whilst it is busy searching initial set of characters

I've got a Google-Searchbar-type input field. When I type a couple of characters and wait about half a second, it runs the ajax call to the external website I've set in the "source" function of the autocomplete code, and once the results come back it displays them on screen (like it should).
The problem is that while the ajax call is being run to fetch the results it won't allow me to continue typing in the input field until the ajax call has completed.
How can I get it to allow me to continue typing while the ajax call is being made?
Here is my jQuery function:
$('#googleSearchbar').autocomplete({
    minLength: 2,
    autoFocus: true,
    delay: 500,
    source: function (request, response) {
        results = $.parseJSON($(this).callJson('post', 'http://my_external_url', {
            data: request.term
        }));
        response(results);
    },
    error: function (err) {
        console.error('ERROR : ' + err);
        return false;
    }
});
I have a hunch you are blocking the browser when making your AJAX request. This line:
results = $.parseJSON($(this).callJson('post', 'http://my_external_url', {
data: request.term
}));
Makes me think that $(this).callJson(...) is a synchronous request, which is going to lock up the entire browser for the duration of the request.
You need to make an asynchronous request and call the response function when that request completes. This should stop the browser from locking up.
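An asynchronous source might look like the sketch below. `makeRequest` is a stand-in for `$.ajax` (or for `$(this).callJson` rewritten to be asynchronous), and the endpoint and data shape are assumptions taken from the question; the point is only the callback wiring.

```javascript
// Asynchronous autocomplete source sketch. makeRequest stands in for
// $.ajax: it must invoke its success callback later instead of blocking.
// source() returns immediately, so typing stays responsive while the
// request is in flight; response() is called when the data arrives.
function createSource(makeRequest) {
    return function (request, response) {
        makeRequest({
            url: 'http://my_external_url',        // hypothetical endpoint from the question
            type: 'post',
            data: { data: request.term },
            success: function (results) { response(results); },
            error: function () { response([]); }  // keep the widget usable on failure
        });
    };
}
```

With jQuery you would pass `$.ajax` (with `dataType: 'json'`) as `makeRequest`; jQuery then parses the JSON for you, so the explicit `$.parseJSON` call is no longer needed.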

IE memory leak and eval with jQuery

I've created a page whose elements need to be updated according to what's happening with the data in our database. I'd like to know what you think about this approach using eval; I know it's risky, but in my case it was the fastest way.
$('.updatable').each(function () {
    var data = 'ViewObjectId=' + $(this).attr('objectid');
    $.ajax({
        async: true,
        url: '/Ajax/GetUpdatedViewObjectDataHandler.ashx',
        data: data,
        type: 'POST',
        timeout: 10000,
        success: function (data) {
            $.each(data, function (index, value) {
                eval(value);
            });
        }
    });
});
Now, the issue I have: once the page is loaded, it is updated every 10 seconds; up to here, it's perfect.
After each round of updates, though, Internet Explorer steals some memory, and it consumes the whole machine's memory after some hours. Terrific.
What would you do in this case? Is some other update approach recommended? Or is there something you think I could do to avoid this memory leak?
Found the answer here: Simple jQuery Ajax call leaks memory in Internet Explorer
THE SOLUTION:
var request = $.ajax({ .... });
request.onreadystatechange = null;
request.abort = null;
request = null;
jQuery doesn't do that, so the memory is never released.
jQuery version 1.4.2.
Now it's working like a charm.
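The cleanup can be wrapped in a helper so it is applied after every update round. This is a sketch with a stand-in `ajaxFn` so it is self-contained; the nulled property names are exactly those from the workaround above.

```javascript
// Sketch of the IE workaround as a reusable helper. ajaxFn stands in for
// $.ajax and must return the request object. Nulling these properties
// breaks the circular references that 1.4-era jQuery leaves behind on
// old IE, letting its garbage collector reclaim the XHR.
function requestWithCleanup(ajaxFn, opts) {
    var request = ajaxFn(opts);
    request.onreadystatechange = null;
    request.abort = null;
    request = null; // drop our own reference as well
}
```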

jQuery UI AutoComplete Plugin - Questions

I have an ASP.NET MVC 3 Web Application (Razor), and a particular View with the jQuery UI AutoComplete plugin (v1.8).
Here's the setup I currently have:
$('#query').autocomplete({
    source: function (request, response) {
        $.ajax({
            url: "/Search/FindLocations",
            type: "POST",
            dataType: "json",
            data: { searchText: request.term },
            success: function (data) {
                response($.map(data, function (item) {
                    return { name: item.id, value: item.name, type: item.type };
                }));
            },
            error: function (xmlHttpRequest, textStatus, errorThrown) {
                // don't know what i should do here...
            }
        });
    },
    select: function (event, ui) {
        $.get('/Search/RenderLocation', { id: ui.item.name }, function (data) {
            $('#location-info').html(data);
        });
    },
    delay: 300,
    minLength: 3
});
The AutoComplete returns locations in the world, basically identical to Google Maps auto complete.
Here are my questions:
1) What are the recommended settings for delay and minLength? Leave as default?
2) I thought about putting [OutputCache] on the controller action, but it looks as though the plugin automatically does caching? How does this work? Does it store the results in a cookie? If so, when does it expire? Is any additional caching recommended?
3) I've noticed that if I type something and then, while the AJAX request is in flight, type something else, the dialog momentarily shows the first result set, then the second. I can understand why, but it's confusing to the user (given the AJAX request can take 1-2 seconds), so I'm thinking about using async: false in the $.ajax options to prevent multiple requests - is this bad design/UX?
4) Can you recommend any other changes on my above settings for improving performance/usability?
1) It really depends on your usage and your data.
2) You should use [OutputCache]. If the plugin does any caching, it is only per user; caching at the controller-action level caches one result for all users. (Again, this might actually be bad depending on your usage, but usually it is a good idea.)
3) This question is hard to answer without more context. If the ajax requests take 1-2 seconds and there's no way to make them shorter, you really should build in a pretty big delay so that users aren't sending off many requests while typing out a long word (if they type slowly).
4) It sounds like you need to look at your /Search/FindLocations method and see where you can add caching or performance improvements. Show us the code in there and I can try to suggest more.
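On question 3: rather than async: false (which freezes the whole UI for the duration of the request), a common pattern is to abort the previous in-flight request whenever a new search starts, so a slow stale response can never overwrite newer results. A sketch, with `makeAjax` standing in for `$.ajax` (the real jqXHR object does expose an abort() method):

```javascript
// Abort-stale-requests sketch for an autocomplete source. makeAjax
// stands in for $.ajax and must return an object with abort(). Each new
// search aborts the previous request, so only the latest results are
// ever passed to the widget.
function createAbortingSource(makeAjax, url) {
    var pending = null;
    return function (request, response) {
        if (pending) {
            pending.abort(); // cancel the stale request
        }
        pending = makeAjax({
            url: url,
            data: { searchText: request.term },
            success: function (data) {
                pending = null;
                response(data);
            }
        });
    };
}
```

Combined with a reasonable delay (300-500 ms), this keeps the UI responsive without blocking the browser.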
