Synchronizing TMediaPlayer.Position and TTrackBar.Position via LiveBindings - Delphi

I have a TTrackBar and a TMediaPlayer, and I'm looking for a way to change the TTrackBar position according to the TMediaPlayer position using the LiveBindings feature.
The problem is that TMediaPlayer has no event that fires when its Position property changes, so my TTrackBar.Position has nothing to synchronize with.
Is it possible to watch changes to a component property without an event being triggered?

No, it is not possible to monitor changes to a property without a suitable event.
And you would not want any event bound to the MediaPlayer.Position property either. Why?
When you are playing a video, the position changes for each and every frame, which means that for a video at 30 FPS such an event would fire 30 times per second. Depending on the code in that event handler, it could quickly bring your application to a crawl.
So the best suggestion I can give you is to place a timer on your form and check the media player position at regular intervals to update your TrackBar. I believe a one-second interval is more than enough, but you can make it shorter if you wish.
Just make sure that, if you also use the TrackBar for seeking, you use some control variable to tell whether the TrackBar position is being updated by the user or by your timer. Otherwise you will end up with weird stuttering (it happened to me the first time).
As for achieving all of this with LiveBindings alone, I don't think it is possible.

Related

Delphi - Simulate Click Animation

This is my first post here on Stack Overflow, so forgive me for anything I'm doing wrong.
I'm making a kind of guide for users of my application who have no computer knowledge, where I show them how to use it by indicating what they should do, more specifically where to click. I want to do that by moving a "fake" cursor to the button and simulating a click, and here is my problem: I have to simulate just the animation of the click, not the event itself, and I couldn't find a way to do that. Can anyone help me?
What you're describing is exactly what WH_JOURNALPLAYBACK is for. It populates the message queue with the mouse and keyboard messages you want to occur, and the OS interprets them. In your case, activate the playback hook and play the mouse events necessary for a click.
In preparation, you'll probably want to use WH_JOURNALRECORD to discover what messages you need. Once you have them, you can probably winnow them down to a reasonably sized list prior to shipping your product to customers. (In particular, you'll probably record many more mouse-move messages than you really need.)
In your button's click handler, check whether playback is active. Only perform the rest of the event handler when playback isn't active. That way, your program will behave just as though the button were clicked (including any animation), but it won't execute the real event code.

How to prevent Nuance's DragonMobile from turning off VoiceOver?

Nuance's DragonMobile component apparently turns off VoiceOver announcements between the initial call to SKRecognizer's initWithType:detection:language:delegate and the component's call to recognizerDidFinishRecording:. It makes some sense that they do this, since they don't want the VoiceOver announcements to be picked up by the mic and transcribed.
The problem is that there's usually a 1-2 second gap between the initialization of the recognizer and the initial call to recognizerDidBeginRecording:. In order to prevent the user's first few words from getting cut out of the transcription, it's necessary to use recognizerDidBeginRecording: to indicate to the user that they should start speaking (i.e. you can't just have them hit the mic button and start speaking immediately).
My problem is that since DragonMobile turns off VoiceOver as soon as initWithType: is called, I have no way of indicating to a VoiceOver user that they should begin talking at the appropriate time.
Found something of a workaround: DragonMobile allows you to specify SKEarcons, which are audio files that play whenever recording is started, stopped or canceled. I'm going to record VoiceOver making the announcements that I need and then use these recordings as the earcons, so that it will sound like the rest of VoiceOver.
According to a Nuance technical rep I just spoke to, DragonMobile does indeed take over the audio layer and suppress any output during recording, and they don't expose any way around this other than the earcons.

Macro Recording in iOS

Is it possible to record a set of touch events on the iPhone and then play them back?
I have searched a lot but could not find any answer. If it's possible, can anyone explain with an example?
I'm not looking to do this for testing purposes. Within my application, instead of creating an animation, I just want to record a set of events and then play them back to explain the app flow to the users.
Regards.
Recording is pretty simple. Look at the various "Responding to Touch Events" and "Responding to Motion Events" methods on UIResponder. Just create your own UIView subclass (since UIView inherits from UIResponder) and keep a copy of the events passed into the relevant methods.
Playback is a bit more complicated; there's no way to create UITouch or UIEvent objects yourself (so you can't make a fake event and pass it on to -[UIApplication sendEvent:]). But there's nothing stopping you from manually walking through the array of events you recorded and handling them on your own (aside from it being some kind of ugly code).
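As a rough illustration of the recording half, here is a minimal Swift sketch of a UIView subclass that keeps its own copy of the touches it receives. The RecordedTouch type, the property names, and the choice to store only location, phase, and timestamp are my own assumptions; the only UIKit API used is the standard UIResponder overrides:

    import UIKit

    // One recorded sample: where the touch was, what phase it was in, and when it happened.
    struct RecordedTouch {
        let location: CGPoint
        let phase: UITouch.Phase
        let timestamp: TimeInterval
    }

    // A view that logs every touch it receives while still letting normal handling proceed.
    class RecordingView: UIView {
        private(set) var recordedTouches: [RecordedTouch] = []

        override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
            record(touches)
            super.touchesBegan(touches, with: event)
        }

        override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
            record(touches)
            super.touchesMoved(touches, with: event)
        }

        override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
            record(touches)
            super.touchesEnded(touches, with: event)
        }

        private func record(_ touches: Set<UITouch>) {
            for touch in touches {
                recordedTouches.append(RecordedTouch(location: touch.location(in: self),
                                                     phase: touch.phase,
                                                     timestamp: touch.timestamp))
            }
        }
    }

Playing those samples back is then a matter of interpreting the array yourself (moving a fake cursor, highlighting views, and so on), since, as noted above, you can't feed them back into the event system.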
There's no built-in macro capability, but you could certainly build that ability into your application. You'll need to do more than just play back events, though. Touches aren't normally visible, but if you're trying to explain how to use your app to the user you'll probably want to have some sort of visual representation for the touches that trigger different responses similar to the way the iOS Simulator uses white dots to represent multiple touches when you hold down the option key.
Assuming that you can solve that problem, two strategies for easily recording user actions come to mind:
Use the Undo Manager: NSUndoManager is already set up to "record" undoable events. If you invest some time into making everything in your app undoable, you could (maybe) perform a set of actions, undo them all to move them onto the redo stack, and then save the events on the redo stack as your script (a sketch of this idea follows below).
Use Accessibility: The Accessibility framework sends notifications whenever user interface elements are touched. Your app could use those notifications to create a playback script. You'll still need to write the code to play back the events in the script, though.
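To make the undo-manager strategy concrete, here is a minimal Swift sketch, assuming a hypothetical board-game model; Move, GameBoard, and their members are placeholder names, not part of any framework:

    import Foundation

    // A hypothetical move in some board game.
    struct Move {
        let position: Int
        let player: String
    }

    // A model object whose changes are registered with the undo manager, so the
    // undo/redo stacks double as a record of what the user did.
    class GameBoard {
        let undoManager = UndoManager()
        private(set) var moves: [Move] = []

        func apply(_ move: Move) {
            moves.append(move)
            // Register the inverse operation: undoing an apply removes the move again.
            undoManager.registerUndo(withTarget: self) { board in
                board.removeLastMove()
            }
        }

        func removeLastMove() {
            guard let last = moves.popLast() else { return }
            // Registering the inverse here is what makes redo work; after undoing
            // everything, the redo stack is effectively a replayable script of moves.
            undoManager.registerUndo(withTarget: self) { board in
                board.apply(last)
            }
        }
    }

Whether this is worth the effort depends on how much of your app's interaction maps cleanly onto undoable model changes; purely visual gestures such as scrolling won't show up this way.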
You could mirror your application with AirServer and use any screen capture software to make the video.

How to observe MPMoviePlayerController according to the playback time?

I want to invoke some events at specified playback times, for example when the video reaches 10.0 s or 20.0 s. Is it possible to observe some property of the MPMoviePlayerController instance for such cases? Or is there any other solution?
I had to do almost the same thing, and I didn't find any good solution except using my own NSTimer and checking the current time of the MPMoviePlayerController at certain moments...
So basically, every second I check what the current playback time is, and when it reaches a time I'm interested in, I do my stuff.
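A minimal Swift sketch of that polling approach might look like the following. PlaybackWatcher, the trigger list, and the callback are my own names; the only MPMoviePlayerController API it relies on is the currentPlaybackTime property (the class is deprecated on current iOS versions, but the same polling idea carries over to its replacements):

    import MediaPlayer

    // Polls an MPMoviePlayerController once per second and fires a callback
    // whenever playback passes one of the requested times.
    class PlaybackWatcher: NSObject {
        private let player: MPMoviePlayerController
        private var pendingTriggers: [TimeInterval]
        private let onTrigger: (TimeInterval) -> Void
        private var timer: Timer?

        init(player: MPMoviePlayerController,
             triggerTimes: [TimeInterval],
             onTrigger: @escaping (TimeInterval) -> Void) {
            self.player = player
            self.pendingTriggers = triggerTimes.sorted()
            self.onTrigger = onTrigger
            super.init()
        }

        func start() {
            // One second is usually fine; shorten the interval for finer resolution.
            timer = Timer.scheduledTimer(timeInterval: 1.0,
                                         target: self,
                                         selector: #selector(checkPlaybackTime),
                                         userInfo: nil,
                                         repeats: true)
        }

        func stop() {
            timer?.invalidate()
            timer = nil
        }

        @objc private func checkPlaybackTime() {
            let current = player.currentPlaybackTime
            // Fire every trigger whose time has been reached, then forget it.
            while let next = pendingTriggers.first, current >= next {
                pendingTriggers.removeFirst()
                onTrigger(next)
            }
        }
    }

Remember to call stop() when playback ends, since a repeating timer created this way retains its target.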

Trouble toggling the userInteractionEnabled property in iOS

I am developing a tic-tac-toe game for iOS, and I am using a combination of UIButtons and UIImageViews to allow for user interaction and to display the moves made. My problem is that the buttons continue to accept user input before the CPU makes its move, which breaks my game logic. I have made several attempts to toggle the userInteractionEnabled property, but I have only been able to turn it off. The engine that gets everything started in the game is my buttonPressed method. I also toggle the userInteractionEnabled property within this method, and therein lies my problem: how do I re-enable the property after disabling user interaction? Is there a method that is called in between events that I can override?
I have searched the web and the developer documentation provided by Apple, and I found information on the touchesBegan and touchesEnded methods. However, from what I understand, those methods need to be explicitly called, which brings me back to my original problem of not being able to call those functions without the user's interaction.
If anyone can help, I would greatly appreciate it! I have been racking my brain over this for the past couple of weeks and I am just not seeing a solution.
I'd think that for a game like tic-tac-toe, calculating the countermove should be so fast that it can be done immediately in response to the first button press. If you're doing something complicated to calculate the next move, like kicking off a thread, you might want to reconsider that.
Let's say, though, that your game is something like chess or go, where coming up with a countermove might take a bit longer. Your view controller should have a method to make a move for the current player; let's call it -makeMove:. Your -buttonPressed action should call that method to make a move for the user. In response, -makeMove: should update the state of the game and switch the current player to the next player. If the new current player is the computer, it should then disable the controls and start the process of calculating the next move. Let's imagine that's done by invoking some NSOperation, so that coming up with the next move is an asynchronous task. Once the operation has come up with a move, it should again invoke -makeMove: (by calling -performSelectorOnMainThread:), which will again update the game state and the current player. This time, though, it should see that the new current player is not the computer, and so it should re-enable the controls.
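Here is a condensed Swift sketch of that flow. It swaps the NSOperation for a plain background dispatch queue to keep it short; the Player enum, the squareButtons outlet collection, updateBoard, and computeBestMove are placeholders you would replace with your own game code:

    import UIKit

    enum Player {
        case human
        case computer
    }

    class GameViewController: UIViewController {
        @IBOutlet var squareButtons: [UIButton]!
        private var currentPlayer: Player = .human

        @IBAction func buttonPressed(_ sender: UIButton) {
            // Ignore taps that arrive while it is not the human's turn.
            guard currentPlayer == .human else { return }
            makeMove(at: sender.tag)
        }

        private func makeMove(at position: Int) {
            updateBoard(position: position, player: currentPlayer)
            currentPlayer = (currentPlayer == .human) ? .computer : .human

            if currentPlayer == .computer {
                // Lock the board while the computer "thinks", then hop back onto the
                // main thread and route the reply through the same makeMove path.
                setControls(enabled: false)
                DispatchQueue.global().async {
                    let reply = self.computeBestMove()
                    DispatchQueue.main.async {
                        self.makeMove(at: reply)
                    }
                }
            } else {
                setControls(enabled: true)
            }
        }

        private func setControls(enabled: Bool) {
            squareButtons.forEach { $0.isUserInteractionEnabled = enabled }
        }

        private func updateBoard(position: Int, player: Player) {
            // Placeholder: record the move and update the tapped button or image view.
        }

        private func computeBestMove() -> Int {
            // Placeholder: for tic-tac-toe this is fast enough to run synchronously.
            return 0
        }
    }

The key point is that both the user's tap and the computer's reply funnel through makeMove(at:), so enabling and disabling the controls happens in exactly one place.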
