Macro Recording in iOS

Is it possible to record a set of touch events on the iPhone and then play them back?
I have searched a lot but could not find any answer. If it's possible, can anyone explain with an example?
I'm not looking to do this for testing purposes. Within my application, instead of creating an animation, I just want to record a set of events and then play them back to explain the app flow to the users.

Recording is pretty simple. Look at the various "Responding to Touch Events" and "Responding to Motion Events" methods on UIResponder. Just create your own UIView subclass (since UIView inherits from UIResponder) and keep a copy of the events passed into the relevant methods.
Playback is a bit more complicated: there's no public way to create UITouch or UIEvent objects, so you can't construct a fake event and pass it to -[UIApplication sendEvent:]. But there's nothing stopping you from manually stepping through an array of recorded event data and handling it yourself (aside from the code being somewhat ugly).
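For the recording side, here's a minimal sketch of the idea (assuming ARC; RecordedTouch and RecordingView are hypothetical names). Note that you copy the values out of each UITouch rather than retaining the touch itself, since the system reuses those objects:

#import <UIKit/UIKit.h>

@interface RecordedTouch : NSObject
@property (nonatomic) CGPoint location;
@property (nonatomic) NSTimeInterval timestamp;
@property (nonatomic) UITouchPhase phase;
@end

@implementation RecordedTouch
@end

@interface RecordingView : UIView
@property (nonatomic, strong) NSMutableArray *recordedTouches;
@end

@implementation RecordingView

- (void)recordTouches:(NSSet *)touches {
    if (!self.recordedTouches) self.recordedTouches = [NSMutableArray array];
    for (UITouch *touch in touches) {
        // Copy out the values we care about; UITouch objects are reused by the system.
        RecordedTouch *record = [[RecordedTouch alloc] init];
        record.location = [touch locationInView:self];
        record.timestamp = touch.timestamp;
        record.phase = touch.phase;
        [self.recordedTouches addObject:record];
    }
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [self recordTouches:touches];
    [super touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [self recordTouches:touches];
    [super touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [self recordTouches:touches];
    [super touchesEnded:touches withEvent:event];
}

@end

Playback then becomes a matter of walking recordedTouches on a timer and driving your own handling code with the stored points.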

There's no built-in macro capability, but you could certainly build that ability into your application. You'll need to do more than just play back events, though. Touches aren't normally visible, but if you're trying to explain how to use your app, you'll probably want some sort of visual representation of the touches that trigger different responses, similar to the way the iOS Simulator uses white dots to represent multiple touches when you hold down the Option key (a sketch of such an indicator follows the strategies below).
Assuming that you can solve that problem, two strategies for easily recording user actions come to mind:
Use the Undo Manager: NSUndoManager is already set up to "record" undoable events. If you invest some time into making everything in your app undoable, you could (maybe) perform a set of actions, undo them all to move them to the redo stack, and then save the events in the redo stack as your script.
Use Accessibility: The Accessibility framework sends notifications whenever user interface elements are touched. Your app could use those notifications to create a playback script. You'll still need to write the code to play back the events in the script, though.
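Whichever strategy you pick, the visual side can stay fairly small. Here's a rough sketch of a Simulator-style touch dot, as a helper you might add to a view controller; showTouchIndicatorAtPoint:inView: is a hypothetical name:

#import <QuartzCore/QuartzCore.h>

- (void)showTouchIndicatorAtPoint:(CGPoint)point inView:(UIView *)view {
    // A translucent white dot, roughly fingertip-sized.
    UIView *dot = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 44, 44)];
    dot.center = point;
    dot.backgroundColor = [UIColor colorWithWhite:1.0 alpha:0.7];
    dot.layer.cornerRadius = 22.0;
    dot.userInteractionEnabled = NO; // purely decorative
    [view addSubview:dot];

    // Linger briefly, then fade out and clean up.
    [UIView animateWithDuration:0.3
                          delay:0.2
                        options:UIViewAnimationOptionCurveEaseOut
                     animations:^{ dot.alpha = 0.0; }
                     completion:^(BOOL finished) { [dot removeFromSuperview]; }];
}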

You could mirror your application with AirServer and use any screen-capture software to make the video.

Related

Delphi - Simulate Click Animation

This is my first post here on Stack Overflow, so forgive me for anything I'm doing wrong.
I'm making a kind of guide for users of my application who have no computer knowledge, where I show them how to use it by signaling what they should do, more specifically where to click. I want to do that by moving a "fake" cursor to the button and simulating a click. Here is my problem: I need to simulate just the animation of the click, not the event itself, and I couldn't find a way to do that. Can anyone help me?
What you're describing is exactly what WH_JOURNALPLAYBACK is for. It populates the message queue with the mouse and keyboard messages you want to occur, and the OS interprets them. In your case, activate the playback hook and perform the mouse events necessary for performing a click.
In preparation, you'll probably want to use WH_JOURNALRECORD to discover what messages you need. Once you have them, you can probably winnow them down to a reasonably sized list before shipping your product to customers. (In particular, you'll probably record many more mouse-move messages than you really need.)
In your button's click handler, check whether playback is active. Only perform the rest of the event handler when playback isn't active. That way, your program will behave just as though the button were clicked (including any animation), but it won't execute the real event code.

Getting touches at launch on iOS

On the Mac, one can always get the location of the mouse "outside the event stream" (i.e., even if you've not subscribed to any delegate methods for mouseUp: et al.) by calling [NSEvent mouseLocation].
Is there any way on iOS to get current touch events without listening to touchesBegan:?
I ask because there is at least one situation in which touchesBegan is not called: at app launch, if a touch is already in progress, touchesBegan is never called. In fact, neither are any of the touch-related UIEvent methods called as near as I can tell (nor UIApplication's / UIWindow's sendEvent:, apparently).
I would like to vary the behavior of my app slightly based on whether a touch is in progress at launch. Is there any way to detect an in-progress touch at app launch?
This cannot be done. The simple reason: The touch events don't belong to your app. Each touch belongs to some UI element (or responder). As you already know, this element gets the began, moved, ended, cancelled messages.
This is even true within a properly programmed app: All events regarding one touch are delivered to the very same object. After all, how would another object know what to do with that event, and how should the first object properly finish its expected behavior?
While you can (or could, but probably shouldn't) find a workaround within your app, there's just no way to pass touches between apps.
And on the Mac you may query the mouse position, but in a normal application flow there will always be a mouse-down before you get a mouse-up event.
To be honest, I don't see any reason why this would be needed anyway... oh wait... I could split my app icon into several areas... not sure if it would break privacy laws, though, if you got to know where the user keeps your icon on screen.
I think you could simply "extend" the application launch. When I had time-consuming tasks during my application launch, I used to show the same splash screen with a UIActivityIndicatorView while the work was being carried out.
You could simply create an NSTimer, wait for about 2 seconds, and check for touches during that time, while the splash screen is still showing.
To do this, in application:didFinishLaunchingWithOptions:, push a view controller that looks exactly like the splash screen and check for touches in that view controller. After those 2 seconds, proceed with normal initialisation. This approach also helps if you have time-consuming tasks during initialisation.
I know it's a workaround, but my guess is that checking for touches at launch isn't possible because the application works on the main thread and touches are also processed on the main thread. It could also be that no view controllers or UIWindow are initialised and ready to listen for touches yet.
Hope it helps.
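For what it's worth, a rough sketch of that workaround might look like this; SplashViewController, touchDetected, and finishLaunch are hypothetical names:

#import <UIKit/UIKit.h>

@interface SplashViewController : UIViewController
@property (nonatomic) BOOL touchDetected;
@end

@implementation SplashViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Make this view match the launch image so the hand-off is invisible,
    // then give touches a ~2 second window to arrive.
    [NSTimer scheduledTimerWithTimeInterval:2.0
                                     target:self
                                   selector:@selector(finishLaunch)
                                   userInfo:nil
                                    repeats:NO];
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    self.touchDetected = YES;
}

- (void)finishLaunch {
    // Proceed with normal initialisation, branching on touchDetected.
    NSLog(@"Touch during launch window: %d", self.touchDetected);
}

@end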
You might try overriding hitTest:withEvent: instead.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event;
According to the Apple documentation, it "returns the farthest descendant of the receiver in the view hierarchy (including itself) that contains a specified point."
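For example, a speculative sketch in a UIWindow or root-view subclass, just to observe what the first touch reaches:

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *hit = [super hitTest:point withEvent:event];
    if (hit) {
        // Log whatever the touch lands on; replace with your own handling.
        NSLog(@"Hit %@ at %@", hit, NSStringFromCGPoint(point));
    }
    return hit;
}

Whether this actually fires for a touch already in progress at launch is something you'd need to test; the system may not route that touch to your app at all.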

Can I support VoiceOver in my Cocos2D-iPhone Game?

I'm making a game where the player reacts to sounds via motion. Seeing as the visual element isn't needed to play it, and many will play with their eyes closed, it seems a shame not to be fully VoiceOver compatible. I'm currently using Cocos2D-iPhone and CocosDenshion for audio, and am now starting to think about how I'll build my menu system to choose levels and configure controls.
Is it reasonably easy to support VoiceOver in Cocos2D's menu system, or should I look into trying to create my menus in UIKit, which I have no experience using?
I don't know if Cocos' menu system supports VoiceOver, but if it doesn't, you could probably add the functionality you're looking for yourself without having to delve into a lot of UIKit work. All you need to do is create a UIView subclass which gets added to your main window when your app starts up. Then use the UIAccessibilityContainer protocol and UIAccessibilityPostNotification calls to allow users to interact with your game via VoiceOver.
The UIAccessibilityContainer protocol lets you inform VoiceOver what interface elements are currently on the screen, their labels, their traits, etc. VoiceOver then uses this information to let users swipe between elements and get feedback on them.
When your game changes state, you can change what that protocol sends back and then issue a
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, nil)
...to inform VoiceOver that the screen layout has changed. And to just speak something via VoiceOver, say when your game state has changed, you can send a different notification to speak some text:
UIAccessibilityPostNotification(UIAccessibilityAnnouncementNotification, @"Achievement unlocked!");
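As a rough sketch of the container side (assuming ARC; the elementLabels / elementFrames arrays are hypothetical stand-ins for your game's menu state):

#import <UIKit/UIKit.h>

@interface AccessibilityOverlayView : UIView
// Parallel arrays describing the current menu: NSString labels and
// NSValue-wrapped CGRects in screen coordinates.
@property (nonatomic, strong) NSArray *elementLabels;
@property (nonatomic, strong) NSArray *elementFrames;
@end

@implementation AccessibilityOverlayView

- (BOOL)isAccessibilityElement {
    return NO; // this view is a container, not an element itself
}

- (NSInteger)accessibilityElementCount {
    return self.elementLabels.count;
}

- (id)accessibilityElementAtIndex:(NSInteger)index {
    UIAccessibilityElement *element =
        [[UIAccessibilityElement alloc] initWithAccessibilityContainer:self];
    element.accessibilityLabel = self.elementLabels[index];
    element.accessibilityFrame = [self.elementFrames[index] CGRectValue];
    element.accessibilityTraits = UIAccessibilityTraitButton;
    return element;
}

- (NSInteger)indexOfAccessibilityElement:(id)element {
    // Simple lookup by label; a real implementation might cache its elements.
    return [self.elementLabels indexOfObject:[element accessibilityLabel]];
}

@end

When the game changes state, update the two arrays and post UIAccessibilityLayoutChangedNotification as described above.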
There's no need to go with the UIKit framework; you can use Cocos2D's native methods and classes to implement this.
For sound, there's SimpleAudioEngine, which can be used. You can distinguish between sounds using the ID returned when you play an effect, which is of type ALuint.
ALuint soundEffectID;
//to start
soundEffectID = [[SimpleAudioEngine sharedEngine] playEffect:@"my sound"];
//to stop
[[SimpleAudioEngine sharedEngine] stopEffect:soundEffectID];
You have to manage these effects yourself, and I think your problem will be solved.

UILocalNotification custom fire event, or struggle on with NSTimer

I have a state-transition problem with NSTimer, which I find difficult to keep track of during applicationWillResignActive / applicationDidEnterBackground, given the context of my app.
I was wondering if it might be a better idea to use UILocalNotification, especially given its background/inactive firing. However, I wanted to know whether we have the ability to provide a custom method to UILocalNotification that does not present a dialog box (which would defeat the whole point of my app). In effect, I'd like to use only the timer-fire capabilities of UILocalNotification and handle the fire event with my own method, which does something very "undialog-friendly".
I have checked the ADC docs and they allude to the dialog being presented every time.
Any advice you can give on this would be appreciated.
The dialog box is presented when your app is in the background. But it is not presented when your app is running - instead your app is free to deal with the notification however it sees fit. So it would be perfectly possible to hook it up to a custom method of your own making.
The main reason for this behaviour is a user may not want to go into your app if it's in the background. Of course, with iOS 5 the notification may not be a dialog box - it could be one of the new notification styles.
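For example, here's a minimal sketch of the foreground case in your app delegate; handleTimerFired: is a hypothetical method of your own:

- (void)application:(UIApplication *)application
        didReceiveLocalNotification:(UILocalNotification *)notification {
    if (application.applicationState == UIApplicationStateActive) {
        // The app was in the foreground, so no alert was shown;
        // route the firing to whatever method you like.
        [self handleTimerFired:notification];
    }
}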

Trouble toggling the userInteractionEnabled property in iOS

I am developing a tic-tac-toe game for iOS, and I am using a combination of UIButtons and UIImageViews to allow for user interaction and to display the moves made. My problem is that the buttons continue to accept user input before the CPU makes its move, which breaks my game logic. I have made several attempts to toggle the userInteractionEnabled property, but I have only been able to turn it off. The engine that gets everything started in the game is my buttonPressed method. I also toggle the userInteractionEnabled property within this method, and therein lies my problem: how do I re-enable the property after disabling user interaction? Is there a method that is called in between events that I can override?
I have searched the web and the developer documentation provided by Apple, and I found information on the touchesBegan and touchesEnded methods. However, from what I understand, those methods need to be called explicitly, which brings me back to my original problem of not being able to call those functions without the user's interaction.
If anyone can help, I would greatly appreciate it! I have been racking my brain over this for the past couple of weeks and I am just not seeing a solution.
I'd think that for a game like tic-tac-toe, calculating the countermove should be so fast that it can be done immediately in response to the first button press. If you're doing something complicated to calculate the next move, like kicking off a thread, you might want to reconsider that.
Let's say, though, that your game is something like chess or Go, where coming up with a countermove might take a bit longer. Your view controller should have a method to make a move for the current player; let's call it -makeMove:. Your -buttonPressed action should call that method to make a move for the user. In response, -makeMove: should update the state of the game and switch the current player to the next player. If the new current player is the computer, it should then disable the controls and start the process of calculating the next move. Let's imagine that's done by invoking some NSOperation, so that coming up with the next move is an asynchronous task. Once the operation has come up with a move, it should again invoke -makeMove: (by calling -performSelectorOnMainThread:), which will again update the game state and the current player. This time, though, it should see that the new current player is not the computer, and so it should re-enable the controls.
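A rough sketch of that flow in a view controller, using GCD instead of NSOperation for brevity; the Player enum, the currentPlayer property, applyMove:, and computeNextMove are all hypothetical names:

typedef NS_ENUM(NSInteger, Player) { PlayerHuman, PlayerComputer };

- (IBAction)buttonPressed:(UIButton *)sender {
    [self makeMove:sender.tag]; // button tags identify board squares
}

- (void)makeMove:(NSInteger)squareIndex {
    [self applyMove:squareIndex]; // hypothetical: update game state and board UI
    self.currentPlayer = (self.currentPlayer == PlayerHuman) ? PlayerComputer : PlayerHuman;

    if (self.currentPlayer == PlayerComputer) {
        // Block taps while the computer "thinks".
        self.view.userInteractionEnabled = NO;
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            NSInteger move = [self computeNextMove]; // hypothetical AI
            dispatch_async(dispatch_get_main_queue(), ^{
                [self makeMove:move]; // re-enters with the human up next
            });
        });
    } else {
        // A human is up next, so accept input again.
        self.view.userInteractionEnabled = YES;
    }
}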
