Can I support VoiceOver in my Cocos2D-iPhone Game?

I'm making a game where the player reacts to sounds via motion. Since the visual element isn't needed to play it, and many players will play with their eyes closed, it seems a shame not to be fully VoiceOver compatible. I'm currently using Cocos2D-iPhone and CocosDenshion for audio, and I'm now starting to think about how I'll build my menu system to choose levels and configure controls.
Is it reasonably easy to support VoiceOver in Cocos2D's menu system, or should I look into creating my menus in UIKit, which I have no experience using?

I don't know if Cocos' menu system supports VoiceOver, but if it doesn't, you could probably add the functionality you're looking for yourself without having to delve into a lot of UIKit work. All you need to do is create a UIView subclass which gets added to your main window when your app starts up. Then use the UIAccessibilityContainer protocol and UIAccessibilityPostNotification calls to allow users to interact with your game via VoiceOver.
The UIAccessibilityContainer protocol lets you inform VoiceOver what interface elements are currently on the screen, their labels, their traits, etc. VoiceOver then uses this information to let users swipe between elements and get feedback on them.
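For example, a minimal sketch of that approach might look like the following (the GameAccessibilityView class, the lazy array, and the hard-coded button trait are my own; only the UIAccessibilityContainer methods and UIAccessibilityElement come from UIKit):

// GameAccessibilityView.m - a transparent view layered over the Cocos2D view.
// It exposes one UIAccessibilityElement per on-screen menu item so VoiceOver
// users can swipe between them and hear their labels.
#import <UIKit/UIKit.h>

@interface GameAccessibilityView : UIView
@property (nonatomic, strong) NSMutableArray *accessibleElements;
@end

@implementation GameAccessibilityView

- (NSMutableArray *)accessibleElements
{
    if (!_accessibleElements) {
        _accessibleElements = [NSMutableArray array];
    }
    return _accessibleElements;
}

- (BOOL)isAccessibilityElement
{
    // This view is a container of elements, not an element itself.
    return NO;
}

// The three UIAccessibilityContainer methods VoiceOver queries:
- (NSInteger)accessibilityElementCount
{
    return [self.accessibleElements count];
}

- (id)accessibilityElementAtIndex:(NSInteger)index
{
    return [self.accessibleElements objectAtIndex:index];
}

- (NSInteger)indexOfAccessibilityElement:(id)element
{
    return [self.accessibleElements indexOfObject:element];
}

// Call this for each menu item whenever the game's menu changes, then post
// UIAccessibilityLayoutChangedNotification so VoiceOver re-reads the screen.
- (void)addElementWithLabel:(NSString *)label frame:(CGRect)frame
{
    UIAccessibilityElement *element =
        [[UIAccessibilityElement alloc] initWithAccessibilityContainer:self];
    element.accessibilityLabel = label;
    element.accessibilityFrame = frame;   // frame in screen coordinates
    element.accessibilityTraits = UIAccessibilityTraitButton;
    [self.accessibleElements addObject:element];
}

@end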
When your game changes state, you can change what that protocol sends back and then issue a
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, nil)
...to inform VoiceOver that the screen layout has changed. And to simply speak some text via VoiceOver, for example when an achievement is unlocked, you can post a different notification with the text to announce:
UIAccessibilityPostNotification(UIAccessibilityAnnouncementNotification, @"Achievement unlocked!");

There's no need to go with the UIKit framework; you can use Cocos2D's native methods and classes to implement this.
For sound, CocosDenshion's SimpleAudioEngine can be used. You can distinguish between sounds using the effect ID returned when you play them, which is of type ALuint.
ALuint soundEffectID;
// start the effect and keep the ID it returns
soundEffectID = [[SimpleAudioEngine sharedEngine] playEffect:@"my sound"];
// stop that specific effect later using the same ID
[[SimpleAudioEngine sharedEngine] stopEffect:soundEffectID];
You have to manage these effect IDs yourself, and I think that will solve your problem.
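A minimal sketch of that bookkeeping might look like this (the SoundManager class and its method names are mine, not part of CocosDenshion; only playEffect:, stopEffect:, and the ALuint IDs come from SimpleAudioEngine):

// SoundManager.m - keeps the ALuint ID of each currently playing effect,
// keyed by file name, so individual effects can be stopped later.
#import <Foundation/Foundation.h>
#import "SimpleAudioEngine.h"

@interface SoundManager : NSObject
@property (nonatomic, strong) NSMutableDictionary *effectIDs; // name -> NSNumber-wrapped ALuint
@end

@implementation SoundManager

- (instancetype)init
{
    if ((self = [super init])) {
        _effectIDs = [NSMutableDictionary dictionary];
    }
    return self;
}

- (void)playEffectNamed:(NSString *)name
{
    ALuint effectID = [[SimpleAudioEngine sharedEngine] playEffect:name];
    self.effectIDs[name] = @(effectID);   // remember the ID so we can stop it
}

- (void)stopEffectNamed:(NSString *)name
{
    NSNumber *effectID = self.effectIDs[name];
    if (effectID) {
        [[SimpleAudioEngine sharedEngine] stopEffect:[effectID unsignedIntValue]];
        [self.effectIDs removeObjectForKey:name];
    }
}

@end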

Related

Player Modal transition like the one in Spotify

I'm building a music app and I would like to make a transition like the one Spotify (and other music apps, like the new Apple Music) uses to present its player: a minimized player which expands and covers the main view (modally?) when you drag or tap it.
How can I achieve this? Is there any API or idea on how to do this?
You might want to look at the open-source LNPopupController. It provides presentation behaviour much like the Music app's.
I found a couple of CocoaControls entries that might be helpful:
The first one, and I think the most accurate, is KNSemiModalViewController. It's used in the National Geographic app and is more or less what you need; you would only need to present a full-screen view and remove the background animation if you want.
MWWindow is another possible solution.
MJPopupViewController
Note that none of these controls has the "minimize" function like Spotify's. The only one I found with that feature is SLParallaxController, but you'd need to figure out how to swap the map and the table view for the content you want, or just look at how it does the dismiss/minimize animation.
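If you'd rather roll it yourself, the core of the effect is just a docked "mini player" view whose frame animates up to full screen. A rough sketch of that idea (the miniPlayerBar and playerExpanded properties and the sizes are all mine; a pan gesture could drive the same frame change interactively for the drag behaviour):

// In the container view controller: a mini player bar docked at the bottom
// that expands to cover the whole view when tapped.
- (void)viewDidLoad
{
    [super viewDidLoad];
    CGRect bounds = self.view.bounds;
    self.miniPlayerBar = [[UIView alloc] initWithFrame:
        CGRectMake(0, CGRectGetHeight(bounds) - 64, CGRectGetWidth(bounds), 64)];
    self.miniPlayerBar.backgroundColor = [UIColor darkGrayColor];
    [self.view addSubview:self.miniPlayerBar];

    UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc]
        initWithTarget:self action:@selector(togglePlayer:)];
    [self.miniPlayerBar addGestureRecognizer:tap];
}

- (void)togglePlayer:(UITapGestureRecognizer *)sender
{
    CGRect bounds = self.view.bounds;
    CGRect collapsed = CGRectMake(0, CGRectGetHeight(bounds) - 64,
                                  CGRectGetWidth(bounds), 64);
    BOOL expanding = !self.playerExpanded;   // playerExpanded is a BOOL property

    [UIView animateWithDuration:0.4
                          delay:0
         usingSpringWithDamping:0.9
          initialSpringVelocity:0
                        options:UIViewAnimationOptionCurveEaseInOut
                     animations:^{
                         self.miniPlayerBar.frame = expanding ? bounds : collapsed;
                     }
                     completion:^(BOOL finished) {
                         self.playerExpanded = expanding;
                     }];
}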

Macro Recording in iOS

Is it possible to record a set of touch events on iPhone and then play them back?
I have searched a lot but could not find any answer. If it's possible, can anyone explain with an example?
I'm not doing this for testing purposes. Within my application, instead of creating an animation, I just want to record a set of events and then play them back to explain the app flow to the users.
Regards.
Recording is pretty simple. Look at the various "Responding to Touch Events" and "Responding to Motion Events" methods on UIResponder. Just create your own UIView subclass (since UIView inherits from UIResponder) and keep a copy of the events passed into the relevant methods.
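A rough sketch of the recording side (the RecordingView class and the dictionary keys are mine; note that UIKit reuses UITouch/UIEvent objects, so copy out the values you need rather than retaining the events themselves):

// RecordingView.m - records the location, phase, and timestamp of every touch
// so the sequence can be replayed later.
@interface RecordingView : UIView
@property (nonatomic, strong) NSMutableArray *recordedTouches; // of NSDictionary
@end

@implementation RecordingView

- (NSMutableArray *)recordedTouches
{
    if (!_recordedTouches) {
        _recordedTouches = [NSMutableArray array];
    }
    return _recordedTouches;
}

- (void)recordTouches:(NSSet *)touches phase:(NSString *)phase
{
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInView:self];
        [self.recordedTouches addObject:@{ @"phase"     : phase,
                                           @"x"         : @(location.x),
                                           @"y"         : @(location.y),
                                           @"timestamp" : @(touch.timestamp) }];
    }
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self recordTouches:touches phase:@"began"];
    [super touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self recordTouches:touches phase:@"moved"];
    [super touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self recordTouches:touches phase:@"ended"];
    [super touchesEnded:touches withEvent:event];
}

@end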
Playback is a bit more complicated; there's no public way to create UITouch or UIEvent objects, so you can't build a fake event and pass it to -[UIApplication sendEvent:]. But there's nothing stopping you from walking through your array of recorded event data and driving your app's responses from it yourself (aside from it being somewhat ugly code).
There's no built-in macro capability, but you could certainly build that ability into your application. You'll need to do more than just play back events, though. Touches aren't normally visible, but if you're trying to explain how to use your app to the user you'll probably want to have some sort of visual representation for the touches that trigger different responses similar to the way the iOS Simulator uses white dots to represent multiple touches when you hold down the option key.
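One simple way to show such a dot during playback is a throwaway overlay view per recorded touch (a sketch that assumes the dictionaries captured by the RecordingView above):

#import <QuartzCore/QuartzCore.h>   // for the layer's cornerRadius

// Shows a translucent white dot at a recorded touch location, then fades it out.
- (void)showTouchIndicatorForRecord:(NSDictionary *)record inView:(UIView *)view
{
    CGPoint location = CGPointMake([record[@"x"] floatValue], [record[@"y"] floatValue]);

    UIView *dot = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 44, 44)];
    dot.center = location;
    dot.backgroundColor = [[UIColor whiteColor] colorWithAlphaComponent:0.6];
    dot.layer.cornerRadius = 22;     // make the dot round
    dot.userInteractionEnabled = NO; // never intercept real touches
    [view addSubview:dot];

    [UIView animateWithDuration:0.3
                     animations:^{ dot.alpha = 0; }
                     completion:^(BOOL finished) { [dot removeFromSuperview]; }];
}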
Assuming that you can solve that problem, two strategies for easily recording user actions come to mind:
Use the Undo Manager: NSUndoManager is already set up to "record" undoable events. If you invest some time into making everything in your app undoable, you could (maybe) perform a set of actions, undo them all to move them to the redo stack, and then save the events in the redo stack as your script (a sketch of the registration half follows this list).
Use Accessibility: The Accessibility framework sends notifications whenever user interface elements are touched. Your app could use those notifications to create a playback script. You'll still need to write the code to play back the events in the script, though.
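For the first strategy, the undo-registration half might look like this (the GameModel class and the move methods are mine; turning the accumulated undo/redo information into a saved script is the part you would still have to work out):

// GameModel.m - every move registers its inverse with an NSUndoManager the
// model owns, so the manager accumulates a record of what the user did.
@interface GameModel : NSObject
@property (nonatomic, strong) NSUndoManager *undoManager;
@end

@implementation GameModel

- (instancetype)init
{
    if ((self = [super init])) {
        _undoManager = [[NSUndoManager alloc] init];
    }
    return self;
}

- (void)placePieceAtRow:(NSUInteger)row column:(NSUInteger)column
{
    [[self.undoManager prepareWithInvocationTarget:self]
        removePieceAtRow:row column:column];
    [self.undoManager setActionName:@"Place Piece"];
    // ... update the board state and notify the view ...
}

- (void)removePieceAtRow:(NSUInteger)row column:(NSUInteger)column
{
    [[self.undoManager prepareWithInvocationTarget:self]
        placePieceAtRow:row column:column];
    [self.undoManager setActionName:@"Remove Piece"];
    // ... update the board state and notify the view ...
}

@end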
You could mirror your application with AirServer and use any screen capture software to make the video.

iOS Music Controls as Arbitrary Input

I'm reading how to handle the events themselves, and I know I could intercept them and use them for my own design, but would Apple allow an app like that into the App Store?
http://developer.apple.com/library/IOS/#documentation/EventHandling/Conceptual/EventHandlingiPhoneOS/RemoteControl/RemoteControl.html
You could imagine playing a game and giving the option of physical back/forward buttons mapped to left/right in your game. I mean to use them when the app is active in the foreground, nothing outside the app or in the software control tray.
This article doesn't say specifically that it's not allowed, but it does imply that the events are to control music only.
Can anyone with more Apple/iOS background refer me to a document which specifies the rules about using the buttons?
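For reference, the kind of interception I'm describing is roughly this (a sketch; the moveLeft/moveRight game actions are placeholders):

// In a view controller that is first responder while the game is on screen.
- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];
}

- (BOOL)canBecomeFirstResponder
{
    return YES;
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event
{
    if (event.type != UIEventTypeRemoteControl) {
        return;
    }
    switch (event.subtype) {
        case UIEventSubtypeRemoteControlNextTrack:
            [self moveRight];   // placeholder game action
            break;
        case UIEventSubtypeRemoteControlPreviousTrack:
            [self moveLeft];    // placeholder game action
            break;
        default:
            break;
    }
}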

Testing VoiceOver: how can I verify that my UIAccessibilityLayoutChangedNotification notifications are working?

I'm developing an iOS app that I'd like to make fully accessible. Part of the app involves a sequence in which playing cards are dealt; after user interaction, the hand ends and the next hand is dealt. When a hand has been dealt, I want to make visually impaired users aware of it.
So, after the hand is dealt (a sighted user sees the cards animate into place), I send a UIAccessibilityLayoutChangedNotification notification like so:
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, nil);
When I verify in the Simulator using the Accessibility Inspector, everything looks fine: the notification is displayed. But, when I try to test using VoiceOver on a device, nothing seems to be happening. Is there something I'm missing here? How do I know that it works?
I believe my thinking about how VoiceOver works was a bit off. It seems that UIAccessibilityLayoutChangedNotification isn't meant to notify the user; it's there to notify UIKit that the VoiceOver elements on screen have changed.
I wound up using a combination of sound effects and strategic use of the UIAccessibilityAnnouncementNotification notification to speak updates to the user.
Instead of passing nil, pass a string. VoiceOver will use that string for speech.
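For example (the announcement text is just a placeholder):

// VoiceOver announces the string once it has processed the layout change.
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, @"A new hand has been dealt.");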

Trouble toggling the userInteractionEnabled property in iOS

I am developing a tic-tac-toe game for iOS and I am using a combination of UIButtons and UIImageViews to allow for user interaction and to display the moves made. My problem is that the buttons continue to accept user input before the CPU makes its move, which breaks my game logic. I have made several attempts to toggle the userInteractionEnabled property, but I have only been able to turn it off. The engine that gets everything started in the game is my buttonPressed method. I also toggle the userInteractionEnabled property within this method, and therein lies my problem: how do I re-enable the property after disabling user interaction? Is there a method that is called in between events that I can override?
I have searched the web and the developer documentation provided by Apple, and I found information on the touchesBegan and touchesEnded methods. However, from what I understand, those methods need to be explicitly called, which brings me back to my original problem of not being able to call those functions without the user's interaction.
If anyone can help, I would greatly appreciate it! I have been racking my brain over this for the past couple of weeks and I am just not seeing a solution.
I'd think that for a game like tic-tac-toe, calculating the countermove should be so fast that it can be done immediately in response to the first button press. If you're doing something complicated to calculate the next move, like kicking off a thread, you might want to reconsider that.
Let's say, though, that your game is something like chess or Go, where coming up with a countermove might take a bit longer. Your view controller should have a method to make a move for the current player; let's call it -makeMove:. Your -buttonPressed action should call that method to make a move for the user. In response, -makeMove: should update the state of the game and switch the current player to the next player. If the new current player is the computer, it should then disable the controls and start the process of calculating the next move. Let's imagine that's done by invoking some NSOperation, so that coming up with the next move is an asynchronous task. Once the operation has come up with a move, it should again invoke -makeMove: (by calling -performSelectorOnMainThread:), which will again update the game state and the current player. This time, though, it should see that the new current player is not the computer, and so it should re-enable the controls.
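A rough sketch of that flow (all the property and helper names here are mine, and I hop back to the main thread with the main NSOperationQueue rather than -performSelectorOnMainThread:, which is equivalent for this purpose):

// -makeMove: is the single entry point for both the user's taps and the
// computer's calculated moves.
- (void)buttonPressed:(UIButton *)sender
{
    [self makeMove:[self moveForButton:sender]];      // moveForButton: is a placeholder
}

- (void)makeMove:(id)move
{
    [self.game applyMove:move];                       // update game state (placeholder)
    self.currentPlayerIsComputer = !self.currentPlayerIsComputer;

    if (self.currentPlayerIsComputer) {
        // Lock the board while the computer "thinks".
        self.boardView.userInteractionEnabled = NO;

        [self.moveQueue addOperationWithBlock:^{      // moveQueue: an NSOperationQueue
            id computerMove = [self.game calculateNextMove];   // placeholder
            [[NSOperationQueue mainQueue] addOperationWithBlock:^{
                [self makeMove:computerMove];         // re-enter on the main thread
            }];
        }];
    } else {
        // Back to the human player: unlock the board.
        self.boardView.userInteractionEnabled = YES;
    }
}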
