I've created a page whose elements need to be updated according to what's happening with the data in our database. I'd like to know what you think about this approach using eval; I know it's risky, but in my case it was the fastest way.
$('.updatable').each(function () {
    var data = 'ViewObjectId=' + $(this).attr('objectid');
    $.ajax({
        async: true,
        url: '/Ajax/GetUpdatedViewObjectDataHandler.ashx',
        data: data,
        type: 'POST',
        timeout: 10000,
        success: function (data) {
            // each returned value is a JavaScript snippet to execute
            $.each(data, function (index, value) {
                eval(value);
            });
        }
    });
});
Now, when the page is loaded, the elements are updated every 10 seconds; up to this point it's perfect.
After each round of updates, though, Internet Explorer leaks some memory, and after a few hours it has eaten the machine's entire memory. Terrific.
What would you do in this case? Is some other update approach recommended? Or is there something you think I could do to avoid this memory leak?
Found the answer here: Simple jQuery Ajax call leaks memory in Internet Explorer
THE SOLUTION:
var request = $.ajax({ .... });
// Break the circular references jQuery leaves behind so IE can release the request
request.onreadystatechange = null;
request.abort = null;
request = null;
jQuery doesn't do that, so the memory is never released.
jQuery version 1.4.2.
Now it's working like a charm.
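For reference, this is roughly how the cleanup can be folded into the periodic update itself. A minimal sketch, assuming the 10-second cycle is driven by setTimeout; the original timer code isn't shown, so the scheduling here is my own:

function updateAll() {
    $('.updatable').each(function () {
        var request = $.ajax({
            url: '/Ajax/GetUpdatedViewObjectDataHandler.ashx',
            data: 'ViewObjectId=' + $(this).attr('objectid'),
            type: 'POST',
            timeout: 10000,
            success: function (data) {
                $.each(data, function (index, value) {
                    eval(value);
                });
            },
            complete: function () {
                // break the circular references so IE can reclaim the request
                request.onreadystatechange = null;
                request.abort = null;
                request = null;
            }
        });
    });
    // schedule the next round (assumed 10-second interval)
    setTimeout(updateAll, 10000);
}
updateAll();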
My fellow coder and I are having issues sending some image uploads to an MVC controller via a multipart FormData() request. What we currently have set up has been tested on a Windows desktop using the latest versions of Chrome, IE, Edge, Firefox, and Opera without any issues.
Below is the JavaScript and AJAX we are using. We have multiple file inputs that accept multiple image files. The loop below goes through and individually appends each file from each input.
var index = 0;
var data = new FormData();
data.append("datalist", JSON.stringify(formInfo)); // list of objects
data.append("form", JSON.stringify(headerInfo)); // object
var images = document.getElementsByClassName("ImageUpload");
for (var i = 0; i < images.length; i++) {
    var files = $(images[i]).prop("files");
    if (files != null && files.length > 0) {
        for (var e = 0; e < files.length; e++) {
            // append the image with a filename
            data.append("image" + ++index, files[e]);
        }
    }
}
UpdateData(data);
function UpdateData(datalist) {
    $.ajax({
        url: '@Url.Action("UpdateData", "Home")',
        type: "POST",
        cache: false,
        contentType: false,
        processData: false,
        dataType: "json",
        data: datalist,
        success: function (result) {
            // do stuff
        }
    });
}
In the controller, we are getting the information through Request.Form and Request.Files. As I mentioned, we are not having any problems other than on the iPad. When using the iPad, the request just hangs and never even hits the controller. We do not have access to a Mac, so we can't hook the iPad up to use developer tools. The iPad has the latest version of iOS and has all app updates installed. We have spent a good portion of the afternoon trying various methods we found online, and even gave the Google Chrome app a whirl. But alas, we have been left scratching our heads. Any help or information would be greatly appreciated.
Thanks in advance.
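One low-effort way to narrow down a hang on a device with no developer tools is to add timeout and error handlers so the request at least reports what it is doing. This is a diagnostic sketch of the UpdateData function from the question, not a fix; the alert-based reporting and the 30-second timeout are stand-ins:

function UpdateData(datalist) {
    $.ajax({
        url: '@Url.Action("UpdateData", "Home")',
        type: "POST",
        cache: false,
        contentType: false,
        processData: false,
        dataType: "json",
        data: datalist,
        timeout: 30000, // fail fast instead of hanging forever
        success: function (result) {
            // do stuff
        },
        error: function (xhr, textStatus, errorThrown) {
            // surface the failure on a device with no console
            alert("Upload failed: " + textStatus + " (HTTP " + xhr.status + ")");
        }
    });
}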
I'm using Turbolinks to simulate a "single-page" app. The app is acceptably quick once pages are cached; however, the first clicks are too slow. Is there a way to pre-fill the transition cache by loading some pages in the background?
I'm going to be looking into hacking into the page cache, but wondering if anyone has done this before.
(If this seems unusual, well, just trust that I'm doing this for good reason. Turbolinks gets nearly the performance of a far more complex implementation and overall I'm quite happy with it.)
UPDATE: So it seems like this SHOULD be relatively easy by simply adding entries to the pageCache of Turbolinks, something like:
$.ajax({
    url: url,
    dataType: 'html',
    success: function(data) {
        var dom = $(data);
        Turbolinks.pageCache[url] = {
            ...
        }
    }
});
However, it doesn't seem possible to construct a body element in JavaScript, which is required. Without that, I can't construct the object that is stored in the cache without the browser first rendering it.
Any ideas beyond hacking more into Turbolinks?
UPDATE 2: There was the further problem that pageCache is hidden by a closure, so hacking Turbolinks is necessary. I have a solution that I'm testing that leverages iFrames - seems to be working.
First, hack Turbolinks to allow access to pageCache; the end of your turbolinks.js.coffee should look like this:
@Turbolinks = {
  visit,
  pagesCached,
  pageCache,
  enableTransitionCache,
  enableProgressBar,
  allowLinkExtensions: Link.allowExtensions,
  supported: browserSupportsTurbolinks,
  EVENTS: clone(EVENTS)
}
Then implement a fetch function. This is what you were thinking of; we can use DOMParser to convert the string into a DOM object.
function preFetch(url) {
    $.ajax({
        url: url,
        dataType: 'html'
    }).done(function(data) {
        var key = App.Helper.getCurrentDomain() + url;
        var parser = new DOMParser();
        var doc = parser.parseFromString(data, "text/html");
        Turbolinks.pageCache[key] = {
            url: url,
            body: doc.body,
            title: doc.title,
            positionY: window.pageYOffset,
            positionX: window.pageXOffset,
            cachedAt: new Date().getTime(),
            transitionCacheDisabled: doc.querySelector('[data-no-transition-cache]') != null
        };
    });
}
Usage:
$(function() {
    preFetch('/link1'); // fetch link1 into the transition cache
    preFetch('/link2'); // fetch link2 into the transition cache
});
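If fetching every page up front is too aggressive, one possible extension is to warm the cache on hover instead, so only pages the user is likely to visit get fetched. This is just a sketch on top of the preFetch function above; the data-prefetch attribute and the once-per-link guard are my own assumptions:

var prefetched = {};
$(document).on('mouseenter', 'a[data-prefetch]', function() {
    var url = $(this).attr('href');
    // fetch each link at most once per page load
    if (url && !prefetched[url]) {
        prefetched[url] = true;
        preFetch(url);
    }
});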
From the index page, a user clicks a navigation link, the data attribute is passed via AJAX, and the data is retrieved from the server, but the content is not being updated on the new page.
Been stuck for hours, really appreciate any help!
js
$('a.navLink').on('click', function() {
    var cat = $(this).data("cat");
    console.log(cat);
    $.ajax({
        url: 'scripts/categoryGet.php',
        type: 'POST',
        dataType: "json",
        data: {'cat': cat},
        success: function(data) {
            var title = data[0][0],
                description = data[0][1];
            console.log(title);
            $('#categoryTitle').html(title);
            $('#categoryTitle').trigger("refresh");
            $('#categoryDescription').html(description);
            $('#categoryDescription').trigger("refresh");
        }
    });
});
I'm getting the correct responses back on both console logs, so I know the AJAX call works, but neither the #categoryTitle nor the #categoryDescription div is being updated. I've tried .trigger('refresh') and .trigger('updatelayout'), but no luck!
This was not intended to be an answer, but I can't comment yet (weird SO rules).
You should specify in the question description that the above code IS working, and that your problem occurs WHEN you're playing back and forth on that page/code, i.e. using the JQM AJAX navigation.
From what I understood from the above comment, you're probably "stacking" the AJAX function every time you return to the page, thus getting weird results, if anything at all.
Is your example code wrapped in something? If not (assuming you use JQM v1.4), you should consider wrapping it in $( 'body' ).on( 'pagecontainercreate', function( event, ui ) {..., which I'm still trying to figure out how to best play with myself; see the sketch after the code below.
A simple solution to prevent stacking the AJAX definition would be to create/use a control variable; here is a way to do so:
var navLinkCatchClick = {
    loaded: false,
    launchAjax: function() {
        if (!this.loaded) {
            this.ajaxCall();
        }
    },
    ajaxCall: function() {
        // paste your example code here..
        this.loaded = true;
    }
};
navLinkCatchClick.launchAjax();
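And for completeness, here is a minimal sketch of the pagecontainercreate wrapping mentioned above, assuming JQM v1.4. The namespaced .off()/.on() pair is my own addition, as another way to keep the click handler from stacking when the page is recreated:

$('body').on('pagecontainercreate', function(event, ui) {
    // unbind first so returning to the page doesn't stack duplicate handlers
    $(document).off('click.navLink').on('click.navLink', 'a.navLink', function() {
        var cat = $(this).data("cat");
        // ... the $.ajax call from the question goes here ...
    });
});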
I am new to AJAX polling and I implemented it to fetch data continuously, but memory usage and CPU utilization keep increasing until the browser eventually crashes.
Here is the AJAX call I am using to fetch data continuously:
$(document).ready(function () {
    make_call();
    function make_call() {
        $.ajax({
            url: "url",
            accepts: "application/json",
            cache: false,
            success: function (result) {
                // Some code here
            },
            complete: make_call
        });
    }
});
Is there any other alternative, or am I doing something wrong? Please provide some suggestions or a solution. Thanks in advance.
Your code initiates a new request at the same moment the previous request completes (complete firing on either error or success). You likely want a small delay before requesting new data, which also has the benefit of reducing both server and client load.
$.ajax({
    // ...
    complete: function() {
        setTimeout(make_call, 5000);
    }
});
The above code waits for 5 seconds before making the next request. Tune the value to your needs of "continuous".
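Putting the delay together with the original code, the whole polling loop might look like the sketch below. The 5-second interval and the placeholder URL are assumptions to tune; the Accept header replaces the question's accepts option, since jQuery expects a map there rather than a string:

$(document).ready(function () {
    function make_call() {
        $.ajax({
            url: "url", // replace with the real endpoint
            headers: { Accept: "application/json" },
            cache: false,
            success: function (result) {
                // Some code here
            },
            complete: function () {
                // wait before polling again instead of re-requesting immediately
                setTimeout(make_call, 5000);
            }
        });
    }
    make_call();
});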
I have an ASP.NET MVC 3 Web Application (Razor), and a particular View with the jQuery UI AutoComplete plugin (v1.8).
Here's the setup I currently have:
$('#query').autocomplete({
    source: function (request, response) {
        $.ajax({
            url: "/Search/FindLocations",
            type: "POST",
            dataType: "json",
            data: { searchText: request.term },
            success: function (data) {
                response($.map(data, function (item) {
                    return { name: item.id, value: item.name, type: item.type };
                }));
            },
            error: function (xmlHttpRequest, textStatus, errorThrown) {
                // don't know what i should do here...
            }
        });
    },
    select: function (event, ui) {
        $.get('/Search/RenderLocation', { id: ui.item.name }, function (data) {
            $('#location-info').html(data);
        });
    },
    delay: 300,
    minLength: 3
});
The AutoComplete returns locations in the world, basically identical to Google Maps auto complete.
Here are my questions:
1) What are the recommended settings for delay and minLength? Leave as default?
2) I thought about putting [OutputCache] on the controller action, but it looks as though the plugin automatically does caching? How does this work? Does it store the results in a cookie? If so, when does it expire? Is any additional caching recommended?
3) I've noticed that if I type something and, while that AJAX request is in flight, type something else, the dialog shows the first result momentarily, then the second. I can understand why, but it's confusing to the user (given the AJAX request can take 1-2 seconds), so I'm thinking about using async: false in the $.ajax options to prevent multiple requests - is this bad design/UX?
4) Can you recommend any other changes to my settings above to improve performance/usability?
1) It really depends on your usage and your data.
2) You should use [OutputCache]. If there's any caching happening in the plugin, it's only per user; if you cache at the controller action level, the result is cached once for all users. (Again, this might actually be bad depending on your usage, but usually it's a good thing to do.)
3) This question is kind of hard too because of the lack of context. If the AJAX requests take 1-2 seconds and there's no way to make them shorter, you really should put in a pretty big delay so that users aren't sending off many requests while typing out a long word (if they type slowly).
4) It sounds like you need to look at your /Search/FindLocations method and see where you can do caching or performance improvements. Give us a look at the code in there and I can try to suggest more.
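On question 3 specifically: rather than async: false (which freezes the UI while the request runs), a common alternative is to abort the in-flight request before starting a new one, so a stale response can never overwrite a newer one. This is only a sketch of that pattern, not something the plugin does for you:

var pendingRequest = null;
$('#query').autocomplete({
    source: function (request, response) {
        // cancel the previous lookup so its stale result can't appear first
        if (pendingRequest) {
            pendingRequest.abort();
        }
        pendingRequest = $.ajax({
            url: "/Search/FindLocations",
            type: "POST",
            dataType: "json",
            data: { searchText: request.term },
            success: function (data) {
                response($.map(data, function (item) {
                    return { name: item.id, value: item.name, type: item.type };
                }));
            },
            error: function () {
                // always answer the widget, even on failure or abort,
                // so it doesn't wait on a response that will never come
                response([]);
            }
        });
    },
    delay: 300,
    minLength: 3
});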