I am working on an ASP.NET MVC site that uses Facebook social widgets. Whenever I launch the debugger (IE9 is the browser) I get many error pop-ups with: Error: '__flash__removeCallback' is undefined.
To verify that my code was not responsible, I created a brand new ASP.NET MVC site and hit F5.
If you navigate to this URL: http://developers.facebook.com/docs/guides/web/#plugins you will see the pop-ups appearing.
In other browsers the pop-ups do not appear.
I had been using the latest IE9 beta before updating to IE9 RTM yesterday and had not run into this issue.
As you can imagine it is extremely annoying...
How can I stop those popups?
Can someone else reproduce this?
Thank you!
I can't seem to solve this either, but I can at least hide it for my users:
$('#video iframe').attr('src', '').hide();
try {
    $('#video').remove();
} catch (ex) {}
The first line prevents the issue from breaking the page; the try/catch swallows the error when jQuery removes the element from the DOM explicitly. In my case I was replacing the HTML of a container several parents above this tag and was exposing this exception to the user until this fix.
I'm answering this as this drove me up the wall today.
It's caused by Flash, usually when you haven't put a unique id on your embed object, so it selects the wrong element.
The quickest (and best) way to solve this is to just:
add a UNIQUE id to your embed/object
Now this doesn't always seem to solve it; I had one site where it just would not go away no matter which elements I set the id on (I suspect it was the video player I was asked to use by the client).
This JavaScript code (using jQuery's document-ready handler; replace with your favourite alternative) will get rid of it. Note that this means the callback is never actually removed on those elements. Flash must want to remove it for a reason, so perhaps it will lead to a gradual memory leak in the page's JavaScript, but it's probably trivial.
This is a secondary (and non-optimal) solution:
$(function () {
    // Poll until Flash has defined __flash__removeCallback, then replace it
    // with a null-safe version.
    setTimeout(function () {
        if (typeof __flash__removeCallback != "undefined") {
            __flash__removeCallback = __flash__removeCallback__replace;
        } else {
            setTimeout(arguments.callee, 50);
        }
    }, 50);
});

function __flash__removeCallback__replace(instance, name) {
    // Only clear the callback if the element actually exists.
    if (instance != null)
        instance[name] = null;
}
I found the solution:
try {
    // Clear the YouTube player iframe's src before it is removed, so the
    // Flash callback has nothing left to reference.
    ytplayer.getIframe().src = '';
} catch (ex) {
}
It's been over a month since I last needed to debug the project.
Facebook has now fixed this issue. The annoying pop-up no longer shows up.
I have not changed anything.
Our team decided to zoom out the whole site. So they did this:
This is breaking my Playwright (PW) tests when clicking on a button.
I get this in the inspector:
selector resolved to visible <button id="add-to-cart-btn" data-partid="04-0001" data-…>ADD TO CART</button>
attempting click action
waiting for element to be visible, enabled and stable
element is visible, enabled and stable
scrolling into view if needed
done scrolling
element is outside of the viewport
I found that this is a known issue in Playwright: https://github.com/microsoft/playwright/issues/2768
My question is: how can I bypass this in the most efficient way?
Is there a way to override a Playwright function that handles the initial loading of the page, and set my zoom there?
If I do it via JavaScript, I have to do it every time my page reloads, and that can be really tedious and error-prone in the tests.
This is what I have now, but it is really a hack-like solution:
async removeZoomOutClassFromBodyElement() {
    await this.#page.evaluate(() => {
        const body = document.querySelector('body');
        if (body) {
            // Removes the class only if it exists on the body tag
            body.classList.remove('zoom-out');
        } else {
            throw Error(ErrorMessage.BODY_NOT_FOUND);
        }
    });
}
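One direction I've been wondering about (I'm not sure it is the right approach) is Playwright's page.addInitScript(), which registers a script that runs on every navigation before the page's own scripts, so the cleanup would not have to be repeated manually after each reload. A minimal sketch, assuming the class name is still zoom-out and that page is the Playwright Page object:
// Sketch only: the init script runs in every new document, so the class is
// removed on each reload without extra calls in the tests.
await page.addInitScript(() => {
    document.addEventListener('DOMContentLoaded', () => {
        document.body.classList.remove('zoom-out');
    });
});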
Can you please advise what would be the best approach here?
Thanks!
I'm having an issue with paging when I delete an object using the DeleteObject method (Entity Framework). The deletion works fine, as it is supposed to, but the page number is updated to the next page. I mean, if I'm deleting a record that is on page 3 of my search results, after the deletion is completed, the page number is updated to "page 4", even though the search results still correspond to page 3!
I have checked everything I could think of, but I can't figure out what is wrong. Has anyone had this problem before? (I'm pretty new to MVC, Razor, etc).
Thank you!
Thanks for the reply, Gert Arnold and Moeri; I posted in a hurry on my way to a meeting and didn't add enough details.
As I went to grab the code to post here, I found the solution:
function DeleteRecord(SubscriptionID) {
    var URL = '@Url.Content("~")PubSub/DeleteSubscriber/' + SubscriptionID;
    if (confirm("Are you sure you want to delete this record?")) {
        $.get(URL, function (data) {
            if (data == "True") {
                $("#SubscriptionContainer" + SubscriptionID).show();
                $("#Subscription" + SubscriptionID).html("<b><i>Delete Successful! Refreshing list, please wait...</i></b>");
                window.setTimeout(function () {
                    GetPage($("#PageNumber").val() - 1); // Adding the "- 1" solved the issue
                }, 2000);
            }
        });
    }
}
To fix the code, all I did was replace GetPage($("#PageNumber").val()) with GetPage($("#PageNumber").val() - 1);
As the title suggests, I am building a mobile website with jQuery Mobile (1.3.0) and am trying to implement Google Places Autocomplete (API v3) to aid user input of location data.
The autocomplete functions correctly on desktop, but not when used on a mobile device (I have only tested on iOS 6).
When used on a mobile device, the dropdown list of relevant locations does appear, but it simply disappears when you press one, without loading the selection on the map.
I have looked around and seen some solutions that cite the z-index of
.pac-container
as the culprit (see: http://osdir.com/ml/google-maps-js-api-v3/2012-01/msg00823.html).
I have implemented these fixes but to no avail, and I am not convinced that z-index is the problem because I can see that the selected item does change to its :hover state/colour when pressed on mobile.
Please if anyone has suggestions I am all ears, need any more details let me know.
Saravanan's answer is a bit overkill. To fix the conflict between FastClick and PAC, add the needsclick class to both the pac-item and all its children.
$(document).on({
    'DOMNodeInserted': function () {
        $('.pac-item, .pac-item span', this).addClass('needsclick');
    }
}, '.pac-container');
Thanks Daniel. But the solution I have given has some performance impact.
I have modified the FastClick library a little bit to accomplish this.
First, I added a param to the FastClick constructor, where defaultElCls specifies the elements that should not get FastClick behaviour.
function FastClick(layer, defaultElCls) {
    'use strict';
    var oldOnClick, self = this;
    this.defaultElCls = defaultElCls;
Then modify the needsClick method:
FastClick.prototype.needsClick = function (target) {
    'use strict';
    var nodeName = target.nodeName.toLowerCase();
    if (nodeName === 'button' || nodeName === 'input') {
        // File inputs need real clicks on iOS 6 due to a browser bug (issue #68)
        // Don't send a synthetic click to disabled inputs (issue #62)
        if ((this.deviceIsIOS && target.type === 'file') || target.disabled) {
            return true;
        }
    } else if (nodeName === 'label' || nodeName === 'video') {
        return true;
    }
    return ((/\bneedsclick\b/).test(target.className) || (new RegExp(this.defaultElCls).test(target.className)));
};
Then pass pac-item to the FastClick constructor
new FastClick(document.body, "pac-item");
Hope this will be taken care of by the FastClick library as well :)
I've also encountered this bug, and determined fastclick to be the culprit. I was originally going to go with Devin Smith's answer, but epegzz's warning about MutationEvents being deprecated led me to MutationObservers, and since I haven't seen a fix involving them I thought I'd share my solution.
var observer_config = { attributes: false, childList: true, subtree: false, characterData: false };
var observer = new MutationObserver(function (mutations) {
    var self = this;
    mutations.forEach(function (mutation) {
        // look for the container being added to the DOM
        var pac_container_added = $(mutation.addedNodes).hasClass('pac-container');
        // if it is, begin observing it
        if (pac_container_added) {
            var pac_container = mutation.addedNodes[0];
            self.observe(pac_container, observer_config);
        }
        // look for pac-items being added (as children of pac_container)
        // This will not resolve if the observer on pac-container has not been created
        var pac_item_added = $(mutation.addedNodes).hasClass('pac-item');
        // when pac items are added, add the needsclick class
        if (pac_item_added) {
            $('.pac-item, .pac-item span').addClass('needsclick');
        }
    });
});
observer.observe(document.body, observer_config);
It is more complex than I'd like it to be because we can't just call observer.observe(pac_container) at the top level, since it's added asynchronously. Luckily, the solution to that problem is also a MutationObserver.
We also observe pac_container once it is created. That way, the observer detects the pac-items being added, and when they are, we add the needsclick class.
This is my first time using MutationObservers, so feedback/improvements would be appreciated. As you can see, I used jQuery, but it should be pretty easy to pull it out.
There is a patch for FastClick that makes it work well with Google Places Autocomplete. See this answer :)
After much hair pulling I have found the problem to be the "FastClick" library I added to my project.
As #Saravanan Shanmugam points out in this comment https://stackoverflow.com/a/16932543/1177832
FastClick seems to interfere with autocomplete. Also see above link for the workaround he has added to get the two to play nice.
I have a BrowserField in my app, which works great. It intercepts NavigationRequests for links on my website that go to external sites, and brings up a new window to display those in the regular Browser, which also works great.
The problem I have is that if a user clicks a link to, say, "www.google.com", my app opens it in a new browser, but also logs it in the BrowserHistory. So if they click back, away from Google, they arrive back at my app, but then if they hit back again, the BrowserHistory lands them on the same page they were on (because going back from Google doesn't move back in the history). I've tried to find a way to edit the BrowserField's BrowserHistory, but this doesn't seem possible. Short of creating my own class for logging the browsing history, is there anything I can do?
If I didn't do a good job explaining the problem, don't hesitate to ask for clarification.
Thanks
One possible solution to this problem would be to keep track of the last inner URL visited before the current NavigationRequest URL. You could then check whether the clicked link is an outside link, as you already do, and if it is, call this method:
updateHistory(String url, boolean isRedirect)
with the last URL before the outside link. Using your example this should overwrite "www.google.com" with the last inner URL before the outside link was clicked.
Here is some half pseudocode/half Java to illustrate my solution:
BrowserFieldHistory history = browserField.getHistory();
String lastInnerURL = "";

if navigationRequest is an outside link {
    history.updateHistory(lastInnerURL, true);
    // Handle loading of outer website
} else {
    lastInnerURL = navigationRequest;
    // Visit inner navigation request as normal
}
http://www.blackberry.com/developers/docs/5.0.0api/net/rim/device/api/browser/field2/BrowserFieldHistory.html#updateHistory(java.lang.String, boolean)
I had a similar but slightly different issue. Special links in the HTML content such as device:smth are used to open the barcode scanner, log out, etc., and I didn't want them saved in the BrowserFieldHistory. I found an interesting workaround for that in the WebWork source code. All you need to do is throw an exception at the end, like below:
public void handleNavigationRequest(BrowserFieldRequest request) throws Exception {
    if (request.getURL().startsWith("device:")) {
        // perform logout, open the barcode scanner, etc.
        throw new Exception(); // this exception prevents the URL from being saved in the history
    } else {
        // standard behavior
    }
}
I have a bit of an odd problem, and I'm struggling to track down the root cause...
I have an ASP.NET MVC site, and recently one of my colleagues started using IE9 and noticed a problem with one of the pages - it wasn't updating when they clicked save.
I figured that this would probably be a script issue, as there is a fair bit of jQuery used on this page, and it may still be, but:
If I submit this page in Chrome (or in IE8/7/6), then I get a forms collection with 11 items in it, as I would expect. If I submit the same page in IE9, I get an extra item at the end of the collection which has an empty string as key and an empty string as the value. This causes the call to UpdateModel() to not work (but not throw an exception) - none of these values are updated in my object, and the ModelState is still showing as valid.
So far, I've only found this one page, but I'm curious if anybody might know what is causing this?
Update 04/04/2011 - Narrowed down the culprit:
I removed bits of code until this worked and narrowed it down to some code in my validation. I use the jQuery validate plugin, and had the following as a submit handler (some redaction performed on names...):
submitHandler: function (form) {
    var submitForm = true;
    var newValue, originalValue;
    newValue = $("#newValue").val();
    originalValue = $("#originalValue").val();
    if (newValue != originalValue) {
        // affectedValues is an array populated at the top of the page.
        if ($.inArray(originalValue, affectedValues) != -1 &&
            $.inArray(newValue, affectedValues) == -1) {
            submitForm = confirm("Are you sure you want to do this");
        }
    }
    if (submitForm) {
        form.submit();
    }
},
Removing this from the code (which I can thankfully do, as it's a bit of legacy code), seems to make this work, my empty item in the forms collection is gone. If anybody has any idea why this might have been happening, that'd be great.
It might be worth checking all the form fields in Firebug to see if you have any un-named elements. I know I got caught out by the select behaviour in IE before.
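For example, a quick console check along these lines (just a sketch, assuming jQuery is already loaded on the page) will list every form field that would be submitted without a name:
// Log each form control whose name attribute is missing or empty.
$('form :input').filter(function () {
    return !this.name;
}).each(function () {
    console.log(this);
});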
I had some problems with my MVC sites due to the caching features introduced for IE9. My workaround was to disable caching in my controller by adding an attribute:
[OutputCache(NoStore = true, Duration = 0, VaryByParam = "*")]
public class FaxController : Controller
FF, Chrome, and Opera send just the values of form elements (button, input, ...) that have a NAME.
IE always sends the elements to the server, even a Submit button with an empty name and value, which causes the error.
So to be safe, always name your elements.