Show Large 3D Touch Shortcut Widget - iOS

In iOS 10, when 3D touching on your app, a widget for the app appears along with 3D touch shortcuts. That widget is automatically the small version of your app's widget; is there any way to make that widget the large version of the widget (which is normally viewed by pressing Show More on the widgets screen)?

The 3D Touch widget height is a system-level restriction. (You’ll notice that even Apple doesn’t override it for first-party apps.)
This is because the Quick Action menus themselves can get pretty tall, and widgets’ heights are effectively unlimited. And because one possible method of interacting with the Quick Action menu is sliding your finger up/down from where you pressed, scrolling is obviously out of the question.
As for how to work around this by getting rid of the widget altogether, it seems that iTunes Connect checks the value for the UIApplicationShortcutWidget key to ensure that the given bundle identifier actually exists and rejects the app if it doesn’t. The UIApplicationShortcutWidget key is officially defined so that, if an app has multiple widgets, it can choose which to show in the 3D Touch context.
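For reference, legitimate use of the key looks something like this; the bundle identifier below is a hypothetical example and must name a widget extension that actually ships with your app:
<key>UIApplicationShortcutWidget</key>
<string>com.example.MyApp.TodayWidget</string>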
Until Apple reverses this policy—and I wouldn’t hold my breath since this is something of an edge case—your only workarounds are to have that ignorable widget, or to reconsider the widget experience altogether.
Personally, I’d recommend reconsidering the widget altogether, since Apple recommends that widgets not be mere “launch buttons” like the one you describe, even at the small size. Per the Human Interface Guidelines, widgets are for “glanceable” information or simple interactions outside the app. Is there other useful information or functionality you could place in those top 110 points instead of a launch button?
In addition, of course, you can always file a bug as an enhancement to see if Apple would be willing to entertain the idea. I suspect that it would involve the addition of a separate Info.plist key, probably a Boolean telling iOS whether a widget is desired in that context at all.

It's actually impossible. The short version of your widget is what is displayed when 3D Touching your app icon.
That screen also gives you the option of adding the widget to the widgets screen; only there can you see the full version, by pressing the Show More button.

Related

SwiftUI: Accessibility sound feedback for a draggable element

I am making an application that works essentially like a simple Drag-and-Drop Playground with the command blocks on the left and a droppable area on the right. I want to make it fully compatible with VoiceOver and I'm running into trouble with some of the accessibility aspects since this is my first Swift application.
This is what the playground currently looks like: (App Screenshot)
My goal is to provide the users with audio cues/feedback while they are dragging the elements to help them figure out what part of the screen they are currently at. The ideal functionality would be exactly like what one uses when editing an iOS device's Home screen (the arrangement layout of the apps).
When trying to rearrange apps on the home screen with VoiceOver enabled, you hear a row/column alert when you are dragging an app over an open area. I want a similar type of feedback that says "Droppable Area" when you are over the correct area (see scenario 1).
When trying to rearrange apps on the home screen with VoiceOver enabled, you hear a sound when you tap on an area that has no app icon. (This also happens when you are not editing the layout and simply tap on an open area with no app.) I want that noise to be what you hear when you drag a command over an area that is not droppable (see scenario 2).
Any ideas on how this might be possible or good references to look at?
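One direction to explore (a sketch under assumptions, not a tested solution): track whether the drag location is inside the droppable area and post a VoiceOver announcement on each transition. The dropAreaFrame property below is a hypothetical stand-in for however you hit-test your drop target.
import SwiftUI
import UIKit

struct CommandBlock: View {
    @State private var wasOverDropArea = false
    let dropAreaFrame: CGRect // hypothetical: drop target's frame in global coordinates

    var body: some View {
        Text("Command")
            .gesture(DragGesture(coordinateSpace: .global).onChanged { value in
                let isOver = dropAreaFrame.contains(value.location)
                // Announce only on transitions so VoiceOver isn't flooded.
                if isOver != wasOverDropArea {
                    UIAccessibility.post(
                        notification: .announcement,
                        argument: isOver ? "Droppable area" : "Not a droppable area"
                    )
                    wasOverDropArea = isOver
                }
            })
    }
}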

How can I know when the split-window drag handle is present on iPad?

When an app is running in a third of an iPad screen, there is a small drag handle at the top of its window. In iOS 10, dragging on that handle lets you switch what app is open there. In iOS 11, you can use it to change the app from taking up a third of the screen to floating over the rest of the screen.
My question: how do I know when this handle is present, or at least know that there's something taking up that space? I need to lay out my UI content around it without conflicting with it. It doesn't appear to work with iOS 11's Safe Area APIs.
See here for a sample project trying to put a label at the top of a window without overlaying the drag handle. Run it in a third of an iPad screen.
Start by duplicating the radar. This is definitely something that should be handled by the safe area magic.
The issue here is that the handle is rendered by SpringBoard, so you can only use tricks to guess whether it is currently visible. You can determine whether the window is at the right size and at the right location on screen, and then add some extra safe area insets. This is normally ill-advised for several reasons (you can't know every case in which the handle appears, you'd have to account for right-to-left layouts, and so on), but in this case the problem seems so egregious that I'm not sure I'd recommend leaving things as they are.
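A rough sketch of that heuristic, assuming a plain view controller; the one-third-width check and the 20-point inset are guesses for illustration, not documented values:
import UIKit

class HandleAwareViewController: UIViewController {
    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        guard let window = view.window else { return }
        // Heuristic: a window much narrower than the screen is likely the
        // one-third split or slide-over pane that shows the drag handle.
        let looksLikeThirdPane = window.bounds.width < window.screen.bounds.width / 2
        additionalSafeAreaInsets.top = looksLikeThirdPane ? 20 : 0
    }
}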
Edit
One more option is to see if UIWindow.safeAreaInsets returns a correct value. UINavigationController is able to deduce the safe area correctly, so it is hiding there somewhere.
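For example (again a sketch; whether the window's insets actually account for the handle is exactly what you would be verifying):
// Inside a view controller, after layout has settled:
if let windowInsets = view.window?.safeAreaInsets {
    print("window safe area insets:", windowInsets)
}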

Can you hide app widgets from appearing under the 3D Touch quick actions?

I'd like to stop my app's Today widget from appearing when 3D Touching the app icon on the home screen, because showing both the quick actions and the Today widget just doubles up the actions the app provides (seen in screenshot). Is there any way to hide widgets from the 3D Touch quick actions using the Info.plist or some other method?
Thanks
So, if you have multiple home screen widgets, you can set UIApplicationShortcutWidget to determine which one should be shown. I haven't been able to find a way to stop them from appearing in the 3D Touch shortcut menu altogether.
One option you may be able to try (this works for me in the simulator) is to set this key in your Info.plist to the string "nil", which doesn't correspond to any real widget:
<key>UIApplicationShortcutWidget</key>
<string>nil</string>
Your other option is to create a different widget with more useful functionality, perhaps showing the currently running timer or other stats.
Apple released this as a feature: the user sees handy information when force touching the app icon for shortcut items. I presume we cannot remove the widget from the 3D Touch quick actions.
Unfortunately, there's no way to hide the app widget. If you set UIApplicationShortcutWidget to nil, your app will get rejected.

Actual difference between UIAccessibilityLayoutChangedNotification and UIAccessibilityScreenChangedNotification?

I’m trying to ascertain what exactly happens differently when posting a UIAccessibilityLayoutChangedNotification, and a UIAccessibilityScreenChangedNotification. From what I can see, I can use them interchangeably everywhere and nothing different happens.
The Apple documentation simply says to use LayoutChanged when (for example) an element has been hidden or shown, and to use ScreenChanged if the entire screen changes, but I’m interested in what THEY do when I provide this information, and what I should see differently when using one or the other.
Can anyone give a clear explanation of implementation differences between the two?
These two notifications are for dynamic content on views, and for communicating those changes to VoiceOver for screen reader users. There is little difference between the two, except for their default behavior and the silly little "boop beep" that accompanies ScreenChanged notifications.
In both instances, the argument
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, arg);
represents either a string to be read out or an on-screen element that VoiceOver will shift its focus to. In the event of dramatic context changes, it is important to send focus to a place that makes sense, or to announce that such changes have taken place. Either approach is acceptable from an accessibility point of view, though I prefer approaches that involve the least amount of change possible. For simple layout changes, it is almost always best just to announce the context change and leave focus where it was. Sometimes, though, the element that caused the context change is hidden; then it is clearly necessary to direct VoiceOver to highlight new content, because the default behavior in that case is undefined (or perhaps deterministic, but determined by a framework that knows absolutely nothing about your app!).
The difference between the two events, given that they both do exactly the same thing, is in their default behavior. If you supply nil to UIAccessibilityLayoutChangedNotification, it is as if you had done nothing. If you supply nil to UIAccessibilityScreenChangedNotification, it will send focus to the first object in your view hierarchy that is marked as an accessibilityElement, once all view hierarchy changes and drawing are complete.
UIAccessibilityLayoutChangedNotification
A good use case example for UIAccessibilityLayoutChangedNotification is for dynamic forms. You want to let users know that, based on decisions they've made in the form, new options are available. For example, if in a form you select that you are a Veteran, additional areas of the form may pop up to provide more input, but these areas may have been hidden to other users who did not care about them. So you could shift focus to these elements after user interaction:
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, firstNewFormElement);
Which would shift focus to the provided element and announce its accessibilityLabel.
Or just tell them that the new form elements are there:
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, @"Veterans form elements available");
Which would leave focus where it is, but VoiceOver would announce "Veterans form elements available".
Note: This particular behavior is bugged on my iPad (8.1.2).
Or finally you could do this:
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, nil);
Which does absolutely nothing :). Seriously, I don't even think the a11y framework backend cares. This particular line of code is a complete waste!
UIAccessibilityScreenChangedNotification
A good use case for UIAccessibilityScreenChangedNotification is a customized tabbed browsing situation, where the entire screen, with the exception of your navigation area, changes. You want to let VoiceOver know that essentially the entire screen changed, but NOT to focus the first element (your first tab); instead, focus the first content element.
UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, firstNonGlobalNavElement);
Which would play the "boop beep" sound and then shift focus to just beneath your global navigation bar. Or you could do this:
UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, @"You're on a new tab");
Which would wait for the new tab to load, play the "boop beep" sound, announce "You're on a new tab" in VoiceOver, then shift focus to the first element on the screen, then announce the accessibilityLabel of that element. (PHEW! That's a lot! This is jarring for screen reader users. Avoid this scenario unless absolutely necessary.)
And finally you can of course do this:
UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, nil);
Which is equivalent to:
UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, firstA11yElement);
Both of which will play the "boop beep" sound, shift VoiceOver focus to the first element on the screen, and then announce it.
Finally
In a comment somebody mentioned caching, and I occasionally remark in this answer about things the a11y backend may or may not care about. While it is certainly possible that there is some backend magic happening, I don't believe that, in either of these circumstances, the backend cares at all. The reason I say this is the following:
If you've ever used the UIAccessibilityContainer protocol, you can watch as your container of views gets queried. There is no caching going on. Even the accessibilityElementCount property gets pinged each time VoiceOver moves focus to a new accessibility element within your container. Then it goes through the process of checking which element it is on, asking for the next element, and so on. It is designed at its core to handle dynamic situations. If you were to insert a new element into your container after an interaction, it would still go through all of these queries and be just fine with it!
Furthermore, if you override the properties of the UIAccessibility protocol in order to provide dynamic hints and labels, you can see that those functions also get called every time. As such, I believe the a11y framework backend gleans ABSOLUTELY ZERO information from these notifications. The only information VoiceOver needs to do its job is its currently focused accessibility element and that element's accessibility container. The notifications are simply there for you to make your app more usable for VoiceOver users.
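If you want to watch those queries yourself, here is a minimal sketch (the subclass and logging are my own illustration): with VoiceOver running, the overrides fire every time focus moves, with no caching in between.
import UIKit

class ObservableContainerView: UIView {
    // Called by VoiceOver whenever it (re)enumerates this container's elements.
    override func accessibilityElementCount() -> Int {
        print("accessibilityElementCount() queried")
        return super.accessibilityElementCount()
    }

    override func accessibilityElement(at index: Int) -> Any? {
        print("accessibilityElement(at: \(index)) queried")
        return super.accessibilityElement(at: index)
    }
}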
Imagine how many times Safari would post these notifications if this weren't the case! :)
These particular statements can only be confirmed by someone with backend knowledge of the framework, who works with the code, and should be viewed as conjecture. It could be the case that this is highly version/implementation dependent. Definitely open to discussion on these points! The rest of this post is pretty concrete.
For Your Reference
Most of this comes from experience working with the frameworks, but here is a useful reference if you wish to dig further.
https://developer.apple.com/documentation/uikit/accessibility/uiaccessibility
https://developer.apple.com/documentation/uikit/uiaccessibilitylayoutchangednotification
https://developer.apple.com/documentation/uikit/uiaccessibilityscreenchangednotification
And finally, an open source repo of the silly little app I put together to test all this stuff.
https://github.com/chriscm2006/IOS-A11y-Api-Test
UIAccessibilityScreenChangedNotification indicates that the whole screen has changed and VoiceOver should reset. UIAccessibilityLayoutChangedNotification indicates that one or more elements on the screen, but not the whole screen, have changed.
Post the screen-changed notification when your UI changes dramatically, usually when the user moves into a different part of your app (navigates to a different screen). VoiceOver notifies the user with a tone, clears its caches, and does other preparation to deal with a new set of accessibility data: it rebuilds its index of accessibility elements, since there may be entirely new elements on screen.
UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, nil);
Post the layout-changed notification if some part of your UI changes but the user hasn't necessarily jumped to an entirely different part of your app. (Example: in the iTunes Store app, tapping the price label ($0.99, etc.) next to a song changes it to a "Buy" button.) This tells VoiceOver to re-read the current state of all accessible items on screen and, by doing so, figure out what has changed and inform the user; its current index is out of date because the items on screen have reordered themselves.
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, nil);

iOS default icon for opening a view from a list

In many iOS apps you have a list whose elements contain sub-elements.
Tapping on an element opens a new page, and you can usually press back to return.
This is indicated with a grey ">" symbol on the right.
Is this symbol downloadable somewhere? I know I can just type a ">", but it doesn't look exactly like the default icon used by iOS.
I'm using Xamarin's dialog framework, and a standard RootElement embedded as a list item looks exactly like the iOS default, but I need to customize it with an icon placed left of the text (which is no problem, except that I then lose the default ">" icon).
Googling for iOS system icons, iOS default icons and iOS SDK did not yield the desired result. I'm hoping these icons are embedded somewhere on the device.
I hope you guys can help me out, thanks!
As far as I know there is no way to access a UIImage instance of the chevron during run time. Most likely there is private API for this, but I am not aware of it, and since it's private you are not allowed to use it anyway.
You could probably instantiate a cell that has the disclosure indicator as accessoryType and walk the view hierarchy to find it. But that will break easily, so don't do it.
The best way is to add an image and update it with every new iOS release.
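That said, the chevron itself is simply what UIKit draws for the disclosure-indicator accessory type; if your row can stay a standard cell with a custom image on the left, you keep the system chevron without shipping any artwork. A minimal UIKit sketch (the image name is hypothetical):
let cell = UITableViewCell(style: .default, reuseIdentifier: "cell")
cell.textLabel?.text = "Detail page"
cell.imageView?.image = UIImage(named: "MyCustomIcon") // custom icon left of the text
cell.accessoryType = .disclosureIndicator // system-drawn ">" chevron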
There's the iOS Artwork Extractor which basically gets you every piece of artwork that is used in iOS.
The artwork you are looking for should be named UITableNext (at least that's the name in iOS 6; I don't have an extracted archive of iOS 7 yet).
Strictly speaking, you are violating Apple's rules and copyright if you use their artwork without Apple's written consent.
As far as I know this has never been enforced, and lots of people do it, but it's good to keep it in mind.
