My app is nearly complete, but there's one bug I have to get sorted before release. The app uses Cordova 3.4 and Sencha to build a "native" app for iOS and Android (the bug only relates to iOS).
Basically, when the picker value is changed, unless the user is quick enough in tapping Done, it reverts to the previous value - hard to explain! Here is a video showing the bug in action.
As mentioned before, this is only a problem on iOS (Android is fine). It is also worth noting that when there are two value options in other pickers in the app this bug does not exist. For example, the picker for time (hours & minutes) and date (day & month) do not have this bug - only single value pickers have the issue.
Any ideas?
I have just had to fix this issue within our product, and boy debugging on the iPhone is a right pain when you only have a Windows desktop!
Essentially what seemed to be happening was that when a slot's selection changed, the internal selectedIndex property was being updated, however the _value was not - and it seems that it's the _value that is being consulted.
I created a new slot class as follows, that overrides doItemTap to ensure that value is set appropriately (me._value = me.getValue(true);):
Ext.define('Ext.ux.FixedSlot', {
    extend: 'Ext.picker.Slot',
    xtype: 'fixedslot',

    // Override doItemTap so the slot's _value stays in sync with selectedIndex.
    doItemTap: function (list, index, item, e, event) {
        var me = this;
        me.selectedIndex = index;
        me.selectedNode = item;
        me._value = me.getValue(true); // refresh the cached value from the current selection
        me.scrollToItem(item, true);
    }
});
Then in my picker definition config (we have a class defined as a subclass of field.Select), I instructed it to use my new slot type (defaultType: 'fixedslot'):
Ext.define('Ext.ux.MyFixedPicker', {
    extend: 'Ext.field.Select',
    config: {
        defaultPhonePickerConfig: { defaultType: 'fixedslot' }
    }
});
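If it helps, here's a rough usage sketch (the field name, label and options below are made up for illustration; only the two Ext.ux classes above are the actual fix):
Ext.create('Ext.form.Panel', {
    fullscreen: true,
    items: [{
        xclass: 'Ext.ux.MyFixedPicker',   // use the fixed picker instead of a plain selectfield
        name: 'category',                 // hypothetical field name
        label: 'Category',
        options: [
            { text: 'First',  value: 'first'  },
            { text: 'Second', value: 'second' }
        ]
    }]
});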
I'm hoping that helps you avoid some of the pain of my last six hours! I still can't explain exactly why/where in the Sencha Touch source that's important, but for right now it appears to fix the problem and meet our packaging deadline!
I'm working on a UWP app that hosts a WebView which runs in a separate process.
var webView = new Windows.UI.Xaml.Controls.WebView(WebViewExecutionMode.SeparateProcess);
This results in behavior where, if the WebView has focus, the containing app can't regain focus on its own simply by trying to focus a UI element.
The app supports keyboard shortcuts which may result in different elements getting focus, but this doesn't work correctly when focus is captured by the WebView. The target element appears to receive focus, but it seems as if the process itself is not activated (since the real focus resides in a different process, I suppose...).
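For context, here's roughly the kind of shortcut handler involved (a sketch only; the question doesn't show how the shortcuts are wired, and FocusTarget is just a placeholder for whatever element should receive focus):
// Handler for an app-wide keyboard accelerator (names are placeholders).
private void OnFocusShortcutInvoked(KeyboardAccelerator sender, KeyboardAcceleratorInvokedEventArgs args)
{
    // Works when focus is inside this process; when the out-of-process
    // WebView owns focus, this appears to succeed but the window never
    // actually regains input focus.
    FocusTarget.Focus(FocusState.Programmatic);
    args.Handled = true;
}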
I'm currently trying to activate the app programmatically through protocol registration in an attempt to regain focus.
I added a declaration in the app manifest for a custom protocol, mycustomprotocol, coupled with the following activation override:
protected override void OnActivated(IActivatedEventArgs args)
{
    // Cast to ProtocolActivatedEventArgs to read the Uri of the activation.
    if (args is ProtocolActivatedEventArgs protocolArgs
        && protocolArgs.Uri.Scheme == "mycustomprotocol")
    {
        // bring the window forward / restore focus here
    }
}
And the following code to invoke the activation:
var result = await Windows.System.Launcher.LaunchUriAsync(new Uri("mycustomprotocol:"));
This seems to work only on some computers; on others (not while debugging the app, only when it runs unattached), instead of regaining focus, the app's taskbar icon just flashes orange.
I've created a sample project showing the problem and the semi-working solution here.
Any insight on any of this would be great.
I can reproduce your issue. I found that when you switch focus with the mouse, the focus can be transferred to the TextBlock, so you could work around this by simulating mouse input.
Please use the following code instead of FocusTarget.Focus(FocusState.Programmatic):
InputInjector inputInjector = InputInjector.TryCreate(); // may return null if injection isn't available

var infoDown = new InjectedInputMouseInfo();
// Adjust DeltaX/DeltaY so the injected click lands on the target TextBlock.
infoDown.DeltaX = 10;   // change to suit your layout
infoDown.DeltaY = -150; // change to suit your layout
infoDown.MouseOptions = InjectedInputMouseOptions.LeftDown;

var infoUp = new InjectedInputMouseInfo();
infoUp.DeltaX = 0;
infoUp.DeltaY = 0;
infoUp.MouseOptions = InjectedInputMouseOptions.LeftUp;

inputInjector.InjectMouseInput(new[] { infoDown, infoUp });
Note: if you use the input injection APIs, you need to add the inputInjectionBrokered capability to your Package.appxmanifest.
This is a restricted capability, though: an app that declares it won't pass Store certification, so you can't publish it in the Store.
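For reference, the declaration in Package.appxmanifest looks roughly like this (a sketch trimmed to the relevant parts):
<Package
    ...
    xmlns:rescap="http://schemas.microsoft.com/appx/manifest/foundation/windows10/restrictedcapabilities"
    IgnorableNamespaces="uap rescap">
    ...
    <Capabilities>
        <rescap:Capability Name="inputInjectionBrokered" />
    </Capabilities>
</Package>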
I've been in discussions with a WebView software engineer. The problem is that the separate process still wants to own focus if you try to move the focus away from the webview. His solution is to ask the other process' web engine to give up focus with the following call:
_ = webView.InvokeScriptAsync("eval", new string[] { "window.departFocus('up', { originLeft: 0, originTop: 0, originWidth: 0, originHeight: 0 });" });
You can call it before trying to change the focus to your target. I ran various tests and it works consistently.
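Put together, something like this (a sketch; FocusTarget again stands in for whatever element should end up with focus, and this assumes you're inside an async method):
// Ask the out-of-process web engine to release focus first...
await webView.InvokeScriptAsync("eval", new[]
{
    "window.departFocus('up', { originLeft: 0, originTop: 0, originWidth: 0, originHeight: 0 });"
});

// ...then move focus to the desired element in the host app.
FocusTarget.Focus(FocusState.Programmatic);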
I have created an instance of UIAccessibilityElement in order to provide a set of custom actions together with some additional information (i.e. accessibilityLabel + accessibilityHint).
The problem is that VoiceOver doesn't announce the existence of the custom actions. They are there, they work, but they don't get announced. The custom actions' hints are not announced either.
Any ideas?
Code to generate the element is below:
private lazy var accessibilityOverviewElement: UIAccessibilityElement = {
    let element = UIAccessibilityElement(accessibilityContainer: self)
    element.accessibilityLabel = viewModel.accessibilityOverviewTitle
    element.accessibilityHint = viewModel.accessibilityOverviewHint
    element.isAccessibilityElement = true

    let close = UIAccessibilityCustomAction(
        name: viewModel.accessibilityCloseActionTitle,
        target: self,
        selector: #selector(self.accessibilityDidClose))
    close.accessibilityHint = viewModel.accessibilityCloseActionHint

    let expand = UIAccessibilityCustomAction(
        name: viewModel.accessibilityExpandActionTitle,
        target: self,
        selector: #selector(self.accessibilityDidExpand))
    expand.accessibilityHint = viewModel.accessibilityExpandActionHint

    element.accessibilityCustomActions = [close, expand]
    return element
}()
I compute the element's frame in viewDidLayoutSubviews()
override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()

    var frame = view.bounds
    frame.size.height = SleepAidMinifiedPlayerViewController.defaultHeight
    accessibilityOverviewElement.accessibilityFrameInContainerSpace = frame
}
Finally, I need to be able to enable/disable accessibility since this view controller slides from the bottom and hides, but it's not completely removed from the view hierarchy (so VoiceOver still focuses on its elements)
func setAccessibility(enabled isEnabled: Bool) {
    view.accessibilityElements = isEnabled
        ? [accessibilityOverviewElement, /* + other accessible elements */].compactMap { $0 }
        : []
}
Thanks!
Any ideas?
I already created a radar about this problem: VoiceOver doesn't read out the custom actions anymore - Nov 4, 2019 at 5:01 PM – FB7426771.
Description: "Natively, in iOS 13, VoiceOver doesn't announce available actions even if they're present: example in the alarms settings, select an alarm and no actions is read out (it's OK in iOS 12) while they exist.
Moreover, if I create an element in an app with custom actions, they won't be announced in iOS 13 but they can be used if I know they're here (up and down swipe to get them).
However, if i use an older app targeting iOS 12, my elements containing custom actions are perfectly spelled out with the "actions available" announced with an iOS 12 device while the iOS 13 device does announce them 'sometimes'.
Please correct this huge turning back in the next iOS 13.3 version because it's extremely penalizing for the VoiceOver users."
No answer since, but it's important that a solution be delivered in a future version: I'm looking forward to seeing this correction in the next release notes.
However, your implementation should make your app work as desired; that's not the problem in my view ⇒ there are many useful examples (code + illustrations) if you need further explanations about VoiceOver implementations.
Run your app under iOS 12 and notice that it works, while that's not the case under iOS 13. 😰
⚠️ ⬛️◼️🔳▪️ EDIT ▪️🔳◼️⬛️ ⚠️ (2020/03/17)
The problem is that VoiceOver doesn't announce the existence of the custom actions. They are there, they work, but they don't get announced. The custom actions' hints are not announced either.
Even if you didn't mention which iOS version you're working with, I think this is iOS 13, because this weird behavior was introduced quietly in that version: no WWDC videos or info on the Apple website. 😤
This dedicated a11y site mentioned this modification ⟹ "iOS 13 introduced a new custom actions behavior: the "actions available" announcement isn't always present anymore.
It was previously offered to every element containing custom actions but, now, it will occur when you navigate to another element that contains a different set of actions.
The purpose is to prevent repetitive announcements on elements where the same actions are present as the previous element." 🤓
Take a look at this SO answer that highlights a response from a Technical Support Incident about this subject. 😉
Conclusion: if you need the "actions available" announcement on every element where custom actions are implemented, use iOS 12; otherwise you'll have to work with this new behavior, which wasn't explained anywhere and is definitely not efficient for VoiceOver users ⟹ Apple Technical Support claims that's the way it works from now on. 😰
⚠️ ⬛️◼️🔳▪️ EDIT ▪️🔳◼️⬛️ ⚠️ (2022/11/15)
I don't have this problem anymore, even in iOS 15. 🥳
If you're still in the same bad situation in iOS 16, I suggest checking that you've ticked the box Accessibility > VoiceOver > Verbosity > Actions > Speak in your device settings to make it work as expected (⟹ source). 👍
However, I've had no news from Apple regarding my TSI. 😵💫
I have a Unity UI input field and a text box. When I use Input.GetKeyDown(KeyCode.Return), it only works on the OS X and PC builds and not on the iOS build. The iOS keyboard's Return key does nothing. I have tried the events too, but it doesn't work even then.
Can somebody please tell me the solution to this problem, if there is one?
While I can't think of a way to harness the Return key directly on iOS, there is a way to do so with the "Submit" key using the TouchScreenKeyboard class in Unity.
Specifically, it has a variable TouchScreenKeyboard.done that indicates whether the user has pressed the "Submit" (or equivalent) button on any mobile device (iOS, Android, WP).
You can also check the wasCanceled variable to see whether the user canceled the input.
Example
public class TouchKeyboardExample : MonoBehaviour
{
    private TouchScreenKeyboard touchScreenKeyboard;
    private string inputText = string.Empty;

    void Start()
    {
        touchScreenKeyboard = TouchScreenKeyboard.Open(inputText, TouchScreenKeyboardType.Default);
    }

    void Update()
    {
        if (touchScreenKeyboard == null)
            return;

        inputText = touchScreenKeyboard.text;

        if (touchScreenKeyboard.done)
            Debug.Log("User typed in " + inputText);

        if (touchScreenKeyboard.wasCanceled)
            Debug.Log("User canceled input");
    }
}
I've never tried this on iOS, so I'll just guess here.
Are you using the new Unity UI that was introduced in Unity 4.6 / Unity 5? If so, you might want to use the UI EventSystem, which you probably already have somewhere in the scene (it's added automatically when you add a new Canvas object). If you don't have it in the scene, add it via the menu GameObject -> UI -> Event System.
In the EventSystem game object, there's a component called Standalone Input Module, where you can define the Submit Button property - which is mapped to Unity's Input Manager (Edit -> Project Settings -> Input).
On the individual UI element (the InputField in your case), you can now add an EventTrigger component, which can listen to the Submit event and call a custom method, even passing it some data (e.g. itself, as an InputField parameter of the method).
You can also listen to many more events this way (select, hover, drag, etc).
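If you'd rather wire that up from code, a rough sketch might look like this (class and handler names here are made up for illustration; the EventTrigger and its Submit entry can just as well be added in the Inspector):
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.EventSystems;

public class SubmitHandler : MonoBehaviour
{
    public InputField inputField;

    void Start()
    {
        // Add an EventTrigger to the InputField and listen for the Submit event.
        var trigger = inputField.gameObject.AddComponent<EventTrigger>();
        var entry = new EventTrigger.Entry { eventID = EventTriggerType.Submit };
        entry.callback.AddListener(_ => OnSubmit());
        trigger.triggers.Add(entry);
    }

    void OnSubmit()
    {
        Debug.Log("Submitted: " + inputField.text);
    }
}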
This works fine for me (PC/mobile); try it out:
this.yourInput.onSubmit.AddListener(delegate {
    if (this.yourInput.text.Length > 0)
    {
        // do something here after Enter (PC) or Done (mobile)
    }
});
I'm developing a mobile application using Backbone, jQuery Mobile and PhoneGap. The app works great on Android, iOS and BB >= 6, but on BB5, as expected, there are countless issues coming up.
I'm now facing problems with Backbone itself. I'm debugging it, and it looks like the problem is in the route definitions. The application crashes at start time due to something related to them (still investigating, but debugging is painful on BB5...).
Also, I read that BB5 won't play nice with hash listening, which Backbone relies on for navigation, so I'm wondering if somebody has been able to create a Backbone app on OS 5, or whether it's simply not possible.
I'm updating this question just in case someone faces the same issue:
Short story: it's not possible to run Backbone on OS 5. I debugged into Backbone and some instructions with regular expressions were causing a crash. Even if those are fixed in the future, we determined that the JavaScript support was simply not good enough, and we finally discarded the OS 5 version.
It is probably not worth it in most cases but this is doable.
I managed to get an app running after quite a bit of work - the JavaScript support is really not great in OS 5.0 and debugging is very, very slow, as suggested in bfcapell's answer.
To get Backbone to work you need to comment out the code that uses the hashchange event to handle URL changes (this is assuming that the router is being used). There is a fallback in Backbone which uses setInterval to poll for changes.
// Depending on whether we're using pushState or hashes, and whether
// 'onhashchange' is supported, determine how we check the URL state.
/*if (this._hasPushState)
{
alert('pushstate');
$(window).bind('popstate', this.checkUrl);
} else if (this._wantsHashChange && ('onhashchange' in window) && !oldIE)
{
alert('hashchange');
$(window).bind('hashchange', this.checkUrl);
} else if (this._wantsHashChange)
{*/
this._checkUrlInterval = setInterval(this.checkUrl, this.interval);
//}
The forEach method in Underscore also needs to be modified so that it doesn't use the native forEach. This is needed for collections to be rendered correctly.
var each = _.each = _.forEach = function (obj, iterator, context) {
    if (obj == null) return;
    // The native forEach branch misbehaves on OS5, so it is commented out;
    // the rest is the stock Underscore _.each body.
    /*if (nativeForEach && obj.forEach === nativeForEach) {
        obj.forEach(iterator, context);
    } else*/
    if (obj.length === +obj.length) {
        for (var i = 0, l = obj.length; i < l; i++) {
            if (i in obj && iterator.call(context, obj[i], i, obj) === breaker) return;
        }
    } else {
        for (var key in obj) {
            if (_.has(obj, key) && iterator.call(context, obj[key], key, obj) === breaker) return;
        }
    }
};
The above should get at least Backbone mostly working. (I say mostly because I have a completely working app, but I expect to find a couple more OS5-specific issues in time.)
I'm using Instruments for iOS automation and I can't seem to figure out how to tap options on the copy/paste menu. When I do a logElementTree(), I see that we are returning a UIEditingMenu and then three elements (which correspond to options of that menu, such as copy/paste, etc.). I am attempting to place this into a variable and then trying to "tap" that variable, but I cannot get that to work. Here is a sample of my code:
var target = UIATarget.localTarget();
var app = target.frontMostApp();
var window = app.mainWindow();
//This generates the highlighted text
app.dragInsideWithOptions({startOffset:{x:0.45, y:0.6}, endOffset:{x:0.45, y:0.6}, duration:1.5});
var copy = app.editingMenu.elements.withName("copyButton");
copy.tap();
Instruments returns, "0) UIAElementNil". In addition to the above, I've also tried:
app.elements.withName("copyButton")
window.elements.withName("copyButton")
So, I can get the editingMenu to produce the available options, but I cannot figure out a way to tap or select one of those options. I'm not quite sure I know how to reference those options to begin with.
Does anyone have any ideas?
Thanks!
You should try app.editingMenu().elements()[index].tap(), where index is the index of the option you want to tap from the array of elements returned. I got mine working this way.
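For example (a quick sketch; the index to use is whatever position logElementTree() shows for the option you want):
var app = UIATarget.localTarget().frontMostApp();
// Tap the second item in the editing menu (index 1), e.g. "Copy".
app.editingMenu().elements()[1].tap();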
Hey.
First of all, I was always using .elements(), not .elements... but it is JS, so it may be invoking a function that is assigned to an object property..?
Anyway, maybe this edit menu is not an internal window of the app, but a system-level menu that is invoked when you do the drag? If that is true, try:
UIATarget.localTarget().frontMostApp().elements().withName("copyButton").tap();
But as I see in the Apple reference, your version calling app.editingMenu() should be fine...
Maybe try calling the buttons by position, and you will see which one responds:
UIATarget.localTarget().frontMostApp().editingMenu().elements()[0].tap();
UIATarget.localTarget().frontMostApp().editingMenu().elements()[1].tap();
UIATarget.localTarget().frontMostApp().editingMenu().elements()[2].tap();
You should find the position of the correct one this way. Once you have its position, you can check its properties with button.logElement();. With this info you should be able to switch back to the .withName method instead of a hardcoded position.
I did this similarly to yoosiba, but with editingMenu element names.
Using Xcode 4.5.1 and device running iOS 6.
Using Alex Vollmer's excellent tuneup_js for target, app and vtap().
Otherwise you can use UIATarget.localTarget().frontMostApp() and tap().
NOTE: vtap() will delay and retry tapping. Without this you may need to add your own delays.
// tap in textFieldA to see editingMenu.
app.mainWindow().textFields()["textFieldA"].vtap();
app.editingMenu().elements()["Select All"].vtap();
app.editingMenu().elements()["Copy"].vtap();
// must delay before attempting next tap
target.delay(2);
// ... navigate to different section of the app
// tap in textFieldB to see editingMenu.
app.mainWindow().textFields()["textFieldB"].vtap();
// paste clipboard contents copied from textFieldA into textFieldB
app.editingMenu().elements()["Paste"].vtap();
target.delay(2);