Nuance's DragonMobile component apparently turns off VoiceOver announcements between the initial call to SKRecognizer's initWithType:detection:language:delegate: and the component's call to recognizerDidFinishRecording:. It makes some sense that they do this, since they don't want the VoiceOver announcements to be picked up by the mic and transcribed.
The problem is that there's usually a 1-2 second gap between the initialization of the recognizer and the initial call to recognizerDidBeginRecording:. In order to prevent the user's first few words from getting cut out of the transcription, it's necessary to use recognizerDidBeginRecording: to indicate to the user that they should start speaking (i.e. you can't just have them hit the mic button and start speaking immediately).
My problem is that since DragonMobile turns off VoiceOver as soon as initWithType: is called, I have no way of indicating to a VoiceOver user that they should begin talking at the appropriate time.
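For concreteness, the flow looks roughly like this (a sketch; the recognizer type, detection mode, and language values are taken from Nuance's sample code and may differ by SDK version):

- (IBAction)micButtonPressed:(id)sender {
    // VoiceOver output is suppressed from this point on.
    self.recognizer = [[SKRecognizer alloc] initWithType:SKDictationRecognizerType
                                               detection:SKLongEndOfSpeechDetection
                                                language:@"en_US"
                                                delegate:self];
}

- (void)recognizerDidBeginRecording:(SKRecognizer *)recognizer {
    // This is where the "start speaking" cue belongs, but an announcement
    // posted here is silenced along with the rest of VoiceOver.
    UIAccessibilityPostNotification(UIAccessibilityAnnouncementNotification,
                                    @"Start speaking now");
}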
Found something of a workaround: DragonMobile allows you to specify SKEarcons, which are audio files that play whenever recording is started, stopped or canceled. I'm going to record VoiceOver making the announcements that I need and then use these recordings as the earcons, so that it will sound like the rest of VoiceOver.
According to a Nuance technical rep I just spoke to, DragonMobile does indeed take over the audio layer and suppress any output during recording, and they don't expose any way around this other than the earcons.
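If anyone else needs this, the earcon setup is only a few lines (a hedged sketch based on Nuance's SpeechKit sample code; the .wav file names are my own recordings):

SKEarcon *start  = [SKEarcon earconWithName:@"voiceover_start_speaking.wav"];
SKEarcon *stop   = [SKEarcon earconWithName:@"voiceover_done.wav"];
SKEarcon *cancel = [SKEarcon earconWithName:@"voiceover_cancelled.wav"];

[SpeechKit setEarcon:start  forType:SKStartRecordingEarconType];
[SpeechKit setEarcon:stop   forType:SKStopRecordingEarconType];
[SpeechKit setEarcon:cancel forType:SKCancelRecordingEarconType];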
Related
I'm working on an application that is designed to speak some information about the data it captures from video as the video is being captured. Right now I'm using UIAccessibilityPostNotification with UIAccessibilityAnnouncementNotification to get VoiceOver to say what I want it to say. This typically works great until the user attempts to navigate between my UI controls. After swiping back and forth along the elements (a menu and an info button) a bit, the application stops speaking the persistent announcements. I also find that posted notifications are not announced if I background the app and then return it to the foreground.
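For reference, this is how I post the announcements (the announced string here is just an example):

UIAccessibilityPostNotification(UIAccessibilityAnnouncementNotification,
                                @"Two faces detected");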
I have a magic-tap handler that pauses and resumes the persistent announcements. Once it is triggered (which also triggers speech about the last rendered info), manually triggered speech works again, and after resuming with a second magic tap the announcements continue as if nothing had stopped.
Is there a mechanism to get VoiceOver to reliably resume speaking without requiring some other kind of on-screen user input in between?
Fortunately, following the rule of "it's probably your own code's fault first", this was my own fault. A timer was inadvertently changing the state of things, and the value I was using to track when to announce was being set to an invalid state, which stopped the announcements.
This is my first post here on Stack Overflow, so forgive me if I'm doing anything wrong.
I'm making a kind of guide for users of my application who have no computer knowledge, where I show them how to use it by signaling what they should do, more specifically where to click. I want to do that by moving a "fake" cursor to the button and simulating a click, and here is where I hit my problem: I have to simulate just the animation of the click, not the event itself, and I couldn't find a way to do that. Can anyone help me?
What you're describing is exactly what WH_JOURNALPLAYBACK is for. It populates the message queue with the mouse and keyboard messages you want to occur, and the OS interprets them. In your case, activate the playback hook and perform the mouse events necessary for performing a click.
In preparation, you'll probably want to use WH_JOURNALRECORD to discover what messages you need. Once you have them, you can probably winnow them down to a reasonably sized list prior to shipping your product to customers. (In particular, you'll probably record many more mouse-move messages than you really need.)
In your button's click handler, check whether playback is active. Only perform the rest of the event handler when playback isn't active. That way, your program will behave just as though the button were clicked (including any animation), but it won't execute the real event code.
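Here's a minimal sketch of the playback side in C (the recorded messages and the g_playingBack flag that your click handler checks are illustrative assumptions):

#include <windows.h>

static EVENTMSG g_events[16];       /* filled in by your WH_JOURNALRECORD pass */
static int      g_count = 0;
static int      g_index = 0;
static HHOOK    g_hook  = NULL;
BOOL            g_playingBack = FALSE;  /* tested in the button's click handler */

LRESULT CALLBACK PlaybackProc(int code, WPARAM wParam, LPARAM lParam)
{
    if (code == HC_GETNEXT) {
        /* Hand the OS the current recorded message; the return value is
           how many milliseconds to wait before processing it. */
        *(EVENTMSG *)lParam = g_events[g_index];
        return 0;
    }
    if (code == HC_SKIP) {
        /* The OS is done with the current message; advance to the next. */
        if (++g_index >= g_count) {
            UnhookWindowsHookEx(g_hook);
            g_playingBack = FALSE;
        }
        return 0;
    }
    return CallNextHookEx(g_hook, code, wParam, lParam);
}

void StartPlayback(HINSTANCE hInstance)
{
    g_index = 0;
    g_playingBack = TRUE;
    g_hook = SetWindowsHookEx(WH_JOURNALPLAYBACK, PlaybackProc, hInstance, 0);
}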
Is it possible to record a set of touch events on the iPhone and then play them back?
I have searched a lot but could not find an answer. If it's possible, can anyone explain with an example?
I'm not looking to do this for testing purposes. Within my application, instead of creating an animation, I just want to record a set of events and then play them back to explain the app flow to users.
Regards.
Recording is pretty simple. Look at the various "Responding to Touch Events" and "Responding to Motion Events" methods on UIResponder. Just create your own UIView subclass (since UIView inherits from UIResponder) and record the data from the events passed into the relevant methods. (Don't retain the UITouch and UIEvent objects themselves; the system reuses them, so copy out the locations, phases, and timestamps you need.)
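A minimal sketch of the recording side (the dictionary format is just one way to store the data):

#import <UIKit/UIKit.h>

@interface RecordingView : UIView
@property (nonatomic, strong) NSMutableArray *recordedTouches;
@end

@implementation RecordingView

- (NSMutableArray *)recordedTouches {
    if (!_recordedTouches) _recordedTouches = [NSMutableArray array];
    return _recordedTouches;
}

- (void)recordTouches:(NSSet *)touches phase:(NSString *)phase {
    for (UITouch *touch in touches) {
        // Copy out the data; UITouch objects are reused by the system.
        [self.recordedTouches addObject:@{
            @"phase":     phase,
            @"location":  [NSValue valueWithCGPoint:[touch locationInView:self]],
            @"timestamp": @(touch.timestamp)
        }];
    }
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [self recordTouches:touches phase:@"began"];
    [super touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [self recordTouches:touches phase:@"moved"];
    [super touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [self recordTouches:touches phase:@"ended"];
    [super touchesEnded:touches withEvent:event];
}

@end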
Playback is a bit more complicated; there's no way to create UITouch or UIEvent objects yourself (so you can't make a fake event and pass it on to -[UIApplication sendEvent:]). But there's nothing stopping you from manually walking an array of recorded event data and handling it on your own (aside from it being some kind of ugly code).
There's no built-in macro capability, but you could certainly build that ability into your application. You'll need to do more than just play back events, though. Touches aren't normally visible, but if you're trying to explain how to use your app to the user, you'll probably want some sort of visual representation for the touches that trigger different responses, similar to the way the iOS Simulator uses white dots to represent multiple touches when you hold down the Option key (a sketch of such an overlay follows the list below).
Assuming that you can solve that problem, two strategies for easily recording user actions come to mind:
Use the Undo Manager: NSUndoManager is already set up to "record" undoable events. If you invest some time into making everything in your app undoable, you could (maybe) perform a set of actions, undo them all to move them to the redo stack, and then save the events in the redo stack as your script.
Use Accessibility: The Accessibility framework sends notifications whenever user interface elements are touched. Your app could use those notifications to create a playback script. You'll still need to write the code to play back the events in the script, though.
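Assuming you solve the recording problem, the visual side can be as simple as an overlay view that draws a translucent dot wherever the played-back touch lands (a sketch; the class name is made up):

#import <UIKit/UIKit.h>

@interface TouchDotView : UIView
@property (nonatomic, assign) CGPoint dotCenter;
@end

@implementation TouchDotView

- (instancetype)initWithFrame:(CGRect)frame {
    if ((self = [super initWithFrame:frame])) {
        self.backgroundColor = [UIColor clearColor];
        self.userInteractionEnabled = NO;  // never intercept real touches
    }
    return self;
}

- (void)setDotCenter:(CGPoint)dotCenter {
    _dotCenter = dotCenter;
    [self setNeedsDisplay];  // redraw the dot at its new position
}

- (void)drawRect:(CGRect)rect {
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextSetFillColorWithColor(ctx,
        [UIColor colorWithWhite:1.0 alpha:0.6].CGColor);
    CGContextFillEllipseInRect(ctx,
        CGRectMake(self.dotCenter.x - 20, self.dotCenter.y - 20, 40, 40));
}

@end

During playback, add an instance of this view over your whole window and move dotCenter along the recorded locations as you dispatch each event.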
You could mirror your application with AirServer and use any screen capture software to make the video.
While playing a video, I'm seeing rate change notifications from AVPlayer that don't seem to be connected to app activity.
When my app receives a UIApplicationDidEnterBackgroundNotification notification, I tell the AVPlayer to pause. The logic is that it should come back to the foreground at the same place the user left. If I do not call pause when going to the background, the problem doesn't appear.
The sequence of events sent to the player is pause, seekToTime:, play. Generally, this works fine but, after the app has been sent to the background and then returned to the foreground, each play invocation results in two rate changes from the AVPlayer. The first is to 1 and the second, immediately following, is to 0. This pattern continues for each call to -[AVPlayer play] as long as that player instance is in use.
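In code, the sequence looks roughly like this (player and targetTime stand in for my actual property and the seek destination):

// On entering the background, pause so we resume at the same spot:
[[NSNotificationCenter defaultCenter]
    addObserverForName:UIApplicationDidEnterBackgroundNotification
                object:nil
                 queue:[NSOperationQueue mainQueue]
            usingBlock:^(NSNotification *note) {
        [player pause];
    }];

// Later, the pause / seek / play sequence:
[player pause];
[player seekToTime:targetTime completionHandler:^(BOOL finished) {
    if (finished) {
        [player play];  // rate goes to 1, then immediately back to 0
    }
}];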
I'm able to put a breakpoint on -[AVPlayer pause] and I do not see it being hit when the rate changes to 0. If I comment out the seekToTime: call, the problem goes away. If I use seekToTime:completionHandler:, I also get the same problem although my block's finished parameter is YES.
Apart from "how do I fix this", I'm interested in any details about how to detect the reason for rate changes in AVPlayer that aren't connected to play/pause. (Putting a breakpoint on -[AVPlayer setRate:] never seems to trigger.)
(One workaround that almost works is to save the player position when entering the background, let it play, and fix the position when returning to the foreground. This also requires some manipulation of audio levels, which is probably doable, but another problem is that not all background notifications indicate that the view has been obscured (e.g. double-tap home button). This leads to cases where my workaround shows a distracting moving image when the app is interrupted but still visible.)
Suggestions?
(A last bit of extra information: In all the cases I've tried, I eventually get to a state where the AVPlayer is changing the rate from 1 to 0 moments after I invoke 'play' if I've returned from the background and then performed a seek. There are things I can do to make it less frequent but none that eliminate it except getting rid of the AVPlayer and creating a new instance. This results in very long delays but is better than a complete malfunction...I guess.
I have some evidence that the seek distance affects the result, which suggests that the error might be in the underlying buffering mechanism. Without knowing what causes the rate change (other than play/pause) I don't see a way to investigate further.)
You are probably getting an AVPlayerItemPlaybackStalledNotification.
Try this:
[[NSNotificationCenter defaultCenter]
    addObserverForName:AVPlayerItemPlaybackStalledNotification
                object:cell.avPlayerItem
                 queue:[NSOperationQueue mainQueue]
            usingBlock:^(NSNotification *note) {
        DDLogVerbose(@"%@", @"AVPlayerItemPlaybackStalledNotification");
    }];
However, the Apple docs say "Playback will continue once a sufficient amount of media has subsequently been delivered." (https://developer.apple.com/library/iOS/documentation/AVFoundation/Reference/AVPlayerItem_Class/Reference/Reference.html#//apple_ref/doc/uid/TP40009532-CH1-SW83)
This is not happening for me or you.
I am still tracking the reason down.
EDIT:
This was the problem for me. When AVPlayerItem.playbackLikelyToKeepUp was YES, it wouldn't happen.
Not sure why it wasn't resuming.
EDIT:
From the Apple docs there is a situation where playback will not resume:
This property communicates a prediction of playability. Factors considered in this prediction include I/O throughput and media decode performance. It is possible for playbackLikelyToKeepUp to indicate NO while the property playbackBufferFull indicates YES. In this event the playback buffer has reached capacity but there isn't the statistical data to support a prediction that playback is likely to keep up in the future. It is up to you to decide whether to continue media playback.
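One way to act on that is to observe the property with KVO and nudge the player yourself when it flips back to YES (a sketch; shouldBePlaying is a hypothetical flag you maintain):

// Somewhere in your setup code (remember to removeObserver: in dealloc):
[self.playerItem addObserver:self
                  forKeyPath:@"playbackLikelyToKeepUp"
                     options:NSKeyValueObservingOptionNew
                     context:NULL];

// In the same object:
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context
{
    if ([keyPath isEqualToString:@"playbackLikelyToKeepUp"]) {
        if (self.playerItem.playbackLikelyToKeepUp && self.shouldBePlaying) {
            [self.player play];  // resume once buffering has caught up
        }
    }
}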
I am developing a tic-tac-toe game for iOS and I am using a combination of UIButtons and UIImageViews to allow for user interaction and to display the moves made. My problem is that the buttons continue to accept user input before the CPU makes its move, which breaks my game logic. I have made several attempts to toggle the userInteractionEnabled property, but I have only been able to turn it off. The engine that gets everything started in the game is my buttonPressed method. I also toggle the userInteractionEnabled property within this method, and therein lies my problem: how do I re-enable the property after disabling user interaction? Is there a method that is called in between events that I can override?
I have searched the web and the developer documentation provided by Apple, and I found information on the touchesBegan and touchesEnded methods. However, from what I understand, those methods need to be called explicitly, which brings me back to my original problem of not being able to call those methods without the user's interaction.
If anyone can help, I would greatly appreciate it! I have been racking my brain over this for the past couple of weeks and I am just not seeing a solution.
I'd think that for a game like tic-tac-toe, calculating the countermove should be so fast that it can be done immediately in response to the first button press. If you're doing something complicated to calculate the next move, like kicking off a thread, you might want to reconsider that.
Let's say, though, that your game is something like chess or go, where coming up with a countermove might take a bit longer. Your view controller should have a method to make a move for the current player; let's call it -makeMove:. Your -buttonPressed action should call that method to make a move for the user. In response, -makeMove: should update the state of the game and switch the current player to the next player. If the new current player is the computer, it should then disable the controls and start the process of calculating the next move. Let's imagine that's done by invoking some NSOperation, so that coming up with the next move is an asynchronous task. Once the operation has come up with a move, it should again invoke -makeMove: on the main thread (e.g. via -performSelectorOnMainThread:), which will again update the game state and the current player. This time, though, it should see that the new current player is not the computer, and so it should re-enable the controls.
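Here's a sketch of that flow (the game model, its applyMove:/nextPlayer/bestMove methods, and the moveQueue and boardView properties are all hypothetical; dispatch_async onto the main queue stands in for -performSelectorOnMainThread:):

typedef NS_ENUM(NSInteger, Player) { PlayerHuman, PlayerComputer };

- (IBAction)buttonPressed:(UIButton *)sender {
    [self makeMove:sender.tag];  // board squares tagged 0..8
}

- (void)makeMove:(NSInteger)square {
    [self.game applyMove:square];
    self.currentPlayer = [self.game nextPlayer];

    if (self.currentPlayer == PlayerComputer) {
        self.boardView.userInteractionEnabled = NO;   // lock out taps while "thinking"
        [self.moveQueue addOperationWithBlock:^{
            NSInteger reply = [self.game bestMove];   // may take a while
            dispatch_async(dispatch_get_main_queue(), ^{
                [self makeMove:reply];                // re-enters on the main thread
            });
        }];
    } else {
        self.boardView.userInteractionEnabled = YES;  // human's turn again
    }
}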