I want to create a custom keyboard and add more characters to it. When the user long-presses a key, it should show more characters, like the iOS default keyboard's popup after a long press.
I created a very complex keyboard a few months ago where every key needed at least 5 different variations (an Amharic dialect). I found that many popups do not work very well because they don't appear above the top of the keyboard. I looked at how other keyboards achieved this but couldn't find a way to make it work. You can see a question on this problem here.
As a workaround I created my custom keyboard with an extra row on top. This row was normally filled with numbers, but on a long press it would switch to show the alternative keys available. On long press the key pressed would be added to the field, and if the user then chose one of the alternatives it would replace the first key.
To give you a better idea here are some images:
Regular Keyboard:
Long press:
NOTE: Excuse the poor quality, but I could only find an intermediate version of the project to screenshot.
While working on this project I found that popups were a lot more difficult to achieve than this approach. I researched creating my own popups with Bézier paths and also using popups directly with CYRKeyboardButton. (Note: although the CYRKeyboardButton gif shows the exact functionality required, I found the gif to be extremely misleading.) But in the end I came back to the above solution, which worked very smoothly, quickly and easily.
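If it helps, here is a minimal sketch of the long-press part of that approach. This is not code from the original project; keyButton, keyLongPressed: and showAlternativesForKey: are hypothetical names.

// Sketch: attach a long-press recognizer to a key so the top row can switch
// to that key's alternative characters. All names below are placeholders.
- (void)attachLongPressToKey:(UIButton *)keyButton {
    UILongPressGestureRecognizer *longPress =
        [[UILongPressGestureRecognizer alloc] initWithTarget:self
                                                      action:@selector(keyLongPressed:)];
    [keyButton addGestureRecognizer:longPress];
}

- (void)keyLongPressed:(UILongPressGestureRecognizer *)gesture {
    if (gesture.state == UIGestureRecognizerStateBegan) {
        UIButton *key = (UIButton *)gesture.view;
        [self showAlternativesForKey:key]; // swap the number row for this key's variants
    }
}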
Hope this helps
This is a quite strange behavior that has 'persecuted' me since iOS 7.0 :) I hope someone can help me this time! As you probably know, when you are using VoiceOver your gestures are totally different from the 'normal way'. When you need to bypass VoiceOver for a specific view, you can set its accessibility traits to UIAccessibilityTraitAllowsDirectInteraction. When the view has this trait set, the user can interact with it as usual (as if VoiceOver were not active in that particular view).
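For reference, the trait is set roughly like this (drawingView is just a placeholder name):

// Sketch: mark a view so VoiceOver passes touches straight through to it.
// drawingView is a placeholder for whatever view needs direct interaction.
drawingView.isAccessibilityElement = YES;
drawingView.accessibilityTraits = UIAccessibilityTraitAllowsDirectInteraction;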
Quite often it happens that this ability is randomly lost, so VoiceOver acts in its normal way.
Has anyone of you encountered this problem? Did you manage to solve it? Fortunately, turning VoiceOver off and on again seems to temporarily fix the issue (until the next time it happens).
Any ideas? Thank you very much.
I've seen this with other things as well. For example, notifications can be spotty, particularly Screen Changed or Content Changed notifications. I believe this happens as a result of turning VoiceOver on and off. If you were to turn VoiceOver on, leave it running, and open your application as a user would, you would never experience these issues.
However, if you use the VoiceOver shortcut, or interrupt the application, re-install, and restart it while using Xcode, you can disrupt VoiceOver's connection to the application. It doesn't bond correctly. Simple things like navigation still work fine, but advanced features like notifications (and perhaps some of the more complicated traits) don't.
Essentially, I would classify this as a bug, but a bug that only shows itself when you use VoiceOver in a way that only a developer would use it.
I’m trying to ascertain what exactly happens differently when posting a UIAccessibilityLayoutChangedNotification, and a UIAccessibilityScreenChangedNotification. From what I can see, I can use them interchangeably everywhere and nothing different happens.
The Apple documentation simply says to use LayoutChanged when (for example) an element has been hidden or shown, and to use ScreenChanged if the entire screen changes, but I’m interested in what THEY do when I provide this information, and what I should see differently when using one or the other.
Can anyone give a clear explanation of implementation differences between the two?
These two notifications are for dynamic content on views, and for communicating those changes to VoiceOver for screen reader users. There is little difference between the two, except for their default behavior and the silly little "boop beep" that accompanies ScreenChanged notifications.
In both instances, the argument
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, arg);
represents either a string to be read out, or an on-screen element that VoiceOver will shift its focus to. In the event of dramatic context changes, it is important to send focus to a place that makes sense, or to announce that such changes have taken place. Either approach is acceptable from an accessibility point of view, though I prefer approaches that involve the least amount of change possible. For simple layout changes, it is almost always best just to announce the context change and leave focus where it was. Sometimes, though, the element that caused the context change is then hidden, and it becomes clearly necessary to direct VoiceOver to highlight new content, because the default behavior in that case is undefined (or perhaps deterministic, but determined by a framework that knows absolutely nothing about your app!).
The difference between the two events, given that they otherwise do exactly the same thing, is in their default behavior. If you supply nil to UIAccessibilityLayoutChangedNotification, it is as if you have done nothing. If you supply nil to UIAccessibilityScreenChangedNotification, it will send focus to the first object in your view hierarchy that is marked as an accessibility element, once all view hierarchy changes and drawing are complete.
UIAccessibilityLayoutChangedNotification
A good use case example for UIAccessibilityLayoutChangedNotification is dynamic forms. You want to let users know that, based on decisions they've made in the form, new options are available. For example, if in a form you select that you are a Veteran, additional areas of the form may appear to collect more input, but these areas may have been hidden from other users who did not care about them. So you could shift focus to these elements after user interaction:
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, firstNewFormElement);
Which would shift focus to the provided element, and announce its accessibilityLabel.
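Putting it together, the flow might look roughly like this (a sketch; veteranFieldsContainer and firstNewFormElement are hypothetical names):

// Sketch: reveal the additional fields, then point VoiceOver at them.
self.veteranFieldsContainer.hidden = NO;
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification,
                                self.firstNewFormElement);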
Or just tell them that the new form elements are there:
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, @"Veterans form elements available");
Which would leave focus where it is, but VoiceOver would announce "Veterans form elements available".
Note: This particular behavior is bugged on my iPad (8.1.2).
Or finally you could do this:
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, nil);
Which does absolutely nothing :). Seriously, I don't even think the a11y framework backend cares. This particular line of code is a complete waste!
UIAccessibilityScreenChangedNotification
A good use case example for UIAccessibilityScreenChangedNotification is customized tabbed browsing, when the entire screen, with the exception of your navigation area, changes. You want to let VoiceOver know that essentially the entire screen changed, but NOT to focus the first element (your first tab); instead you want it to focus the first content element.
UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, firstNonGlobalNavElement);
Which would play the "boop beep" sound and then shift focus to just beneath your global navigation bar. Or you could do this:
UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, @"You're on a new tab");
Which would wait for the new tab to load, play the "beep boop" sound, announce "You're on a new tab" in VoiceOver, then shift focus to the first element on the screen, then announce the accessibilityLabel for that element. (PHEW! That's a lot! This is jarring for screen reader users; avoid this scenario unless absolutely necessary.)
And finally you can of course do this:
UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, nil);
Which is equivalent to:
UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, firstA11yElement);
Both of which will play the "beep boop" sound, shift VoiceOver focus to the first element on the screen, and then announce it.
Finally
In a comment somebody mentioned caching, and I occasionally remark in this answer about things the a11y backend may or may not care about. While it is certainly possible that there is some backend magic happening, I don't believe that, in either of these circumstances, the backend cares at all. The reason I say this is because:
If you've ever used the UIAccessibilityContainer protocol, you can watch as your container of views gets queried. There is no caching going on. Even the accessibilityElementCount property gets pinged each time VoiceOver changes focus to a new accessibility element within your container. Then it goes through the process of checking which element it is on, asking for the next element, and so on. It is designed at its core to handle dynamic situations. If you were to insert a new element into your container after interaction, it would still go through all of these queries and be just fine about it.
Furthermore, if you override the properties of the UIAccessibility protocol in order to provide dynamic hints and labels, you can see that these functions get called every time as well. As such, I believe that the a11y framework backend gleans ABSOLUTELY ZERO information from these notifications. The only information VoiceOver needs to do its job is its currently focused accessibility element and that element's accessibility container. The notifications are simply there for you to make your app more usable for VoiceOver users.
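Here is a minimal sketch of the kind of container you can watch being queried (MyContainerView and myAccessibilityElements are placeholder names):

#import <UIKit/UIKit.h>

@interface MyContainerView : UIView
@property (nonatomic, strong) NSArray *myAccessibilityElements; // placeholder backing store
@end

@implementation MyContainerView

// Sketch of the informal UIAccessibilityContainer protocol on a custom view.
// The log statements make it easy to watch VoiceOver re-query the container
// every time focus moves.
- (BOOL)isAccessibilityElement {
    return NO;                       // the container itself is not an element
}

- (NSInteger)accessibilityElementCount {
    NSLog(@"accessibilityElementCount queried");
    return self.myAccessibilityElements.count;
}

- (id)accessibilityElementAtIndex:(NSInteger)index {
    NSLog(@"accessibilityElementAtIndex: %ld", (long)index);
    return self.myAccessibilityElements[index];
}

- (NSInteger)indexOfAccessibilityElement:(id)element {
    NSLog(@"indexOfAccessibilityElement:");
    return [self.myAccessibilityElements indexOfObject:element];
}

@end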
Imagine if this weren't the case how many times Safari would post these notifications!!!! :)
These particular statements can only be confirmed by someone with backend knowledge of the framework, who works with the code, and should be viewed as conjecture. It could be the case that this is highly version/implementation dependent. Definitely open to discussion on these points! The rest of this post is pretty concrete.
For Your Reference
Most of this comes from experience working with the frameworks, but here is a useful reference if you wish to dig further.
https://developer.apple.com/documentation/uikit/accessibility/uiaccessibility
https://developer.apple.com/documentation/uikit/uiaccessibilitylayoutchangednotification
https://developer.apple.com/documentation/uikit/uiaccessibilityscreenchangednotification
And finally, an open source repo of the silly little app I put together to test all this stuff.
https://github.com/chriscm2006/IOS-A11y-Api-Test
UIAccessibilityScreenChangedNotification is to indicate that the whole screen has changed and VoiceOver should reset.
UIAccessibilityLayoutChangedNotification is to indicate that one or more, but not all, elements on the screen have changed.
Post UIAccessibilityScreenChangedNotification when your UI changes dramatically, usually when the user moves into a different part of your app (navigates to a different screen). VoiceOver notifies the user with a tone, clears its caches, and does other preparations to deal with a new set of accessibility data. It alerts VoiceOver that the screen has changed and that there may be new elements on it, so VoiceOver will rebuild its index of accessibility elements.
UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, nil);
Post UIAccessibilityLayoutChangedNotification if some part of your UI changes, but the user hasn't necessarily jumped to an entirely different part of your app. (Example: in the iTunes Store app, tapping the price label ($0.99, etc.) next to a song changes it to a "Buy" button.) This notification tells VoiceOver to re-read the current state of all accessible items on screen; by doing this it figures out what has changed and informs the user of those changes. It alerts VoiceOver that the layout has changed and that its current index is out of date because the items on the screen have reordered themselves.
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, nil);
I want to know how to implement a keylogger for jailbroken iOS iPhones, and how to prevent my own app from being logged by others.
1. Implementing a keylogger in iOS (jailbreak)
I have thought of two ways to do this,
First way:
I hook the _endedEditing method of the UITextField class and save the text of the text field somewhere. The advantage of this approach is that it's easy and it doesn't depend on how the text was entered, so if some text was pasted into the text field, you would capture that too. The problem is that it's not called for text boxes in HTML web pages (like the Google search bar) or in apps where the keyboard doesn't hide.
Second way:
I would hook the keyboard buttons' press method and save their labels. The keystrokes should be split every time the keyboard hides. There is a problem here for me: I can't find the right method of the right class to hook. I've found UIKeyboardButton, which inherits from UIButton, but when I hook any of its methods it's never called, and I don't know why!
So how can I implement a keylogger for iOS?
2. Preventing an app being keylogged by a third party app/tweak
Now the second part of my question: there is a keylogger in Cydia named iKeyMonitor that logs keystrokes, and it's good at its job. The surprising thing is that it can't log anything typed in the iFile app! I have even tested this with the first approach I mentioned above, and I can't capture any text when I type in iFile. How does iFile do this, and how can we do the same thing in our own app?
Is there a way to force the software keyboard to appear when the user has an iOS Bluetooth keyboard connected?
Or, to that end, is it possible in code to disable a specific Bluetooth device?
Thanks!
In most (maybe all?) iOS apps with which I have used Apple's bluetooth keyboard, pressing the eject key (located in the top right corner) will bring up the soft keyboard on the screen. Maybe that little factoid could help you in some way.
Not from within the application's code, if you're planning on getting into the App Store. Apple expressly does not provide methods to show or hide the keyboard, instead pushing you to use becomeFirstResponder and resignFirstResponder.
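For completeness, the sanctioned route is simply moving first responder status around (a sketch; note that with a hardware keyboard attached, iOS may still keep the soft keyboard hidden):

// Sketch: the App-Store-friendly way to request or dismiss the keyboard.
// iOS decides whether the soft keyboard actually appears, e.g. it may stay
// hidden while a hardware keyboard is connected.
[self.textField becomeFirstResponder];   // request the keyboard for this field
[self.textField resignFirstResponder];   // dismiss it again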
You may be able to do this through some non-AppStore-friendly methods, but somehow I don't think that's the answer you're looking for.
(Note - you could make a fake, Apple-looking keyboard when the real one is hidden, and check if the real one is hidden based on whether a view is visible, but if Apple notices you doing this, you'll get denied.)