AudioKit control interaction capture

I'm trying to capture synth pitch/volume/etc. changes over the duration of an AKSequence, i.e. record changes to AKMIDINoteData as it plays. I'm thinking I should have a slider or wheel on my instrument UI and start capturing slider/wheel positions every 1/256 second or so once a user has started manipulating them. Is that the right way to go about this? Are there existing tutorials for this?
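That timer-sampling approach is essentially how parameter automation is usually recorded: sample the control at a fixed rate while the user is touching it, store (time, value) pairs, and play them back against the sequencer clock. Here's a minimal sketch of the recording side, in TypeScript purely to illustrate the data flow; `ControlRecorder` and its methods are hypothetical names, not AudioKit API, and you'd translate this to Swift in the app.

```typescript
// Records timestamped control values (slider/wheel positions) while
// the user manipulates a control. Hypothetical helper, not AudioKit API.
interface ControlSample {
  time: number;   // seconds since recording started
  value: number;  // normalized control value, 0.0 .. 1.0
}

class ControlRecorder {
  samples: ControlSample[] = [];
  private startTime: number | null = null;

  // Call when the user first touches the control.
  begin(now: number): void {
    this.startTime = now;
    this.samples = [];
  }

  // Call from a timer firing every 1/256 s (or on each value change).
  // Consecutive duplicate values are skipped to keep the data small.
  capture(value: number, now: number): void {
    if (this.startTime === null) return;
    const last = this.samples[this.samples.length - 1];
    if (last && last.value === value) return;
    this.samples.push({ time: now - this.startTime, value });
  }

  // Recorded value at a given playback time (step interpolation).
  valueAt(time: number): number | null {
    let result: number | null = null;
    for (const s of this.samples) {
      if (s.time <= time) result = s.value;
      else break;
    }
    return result;
  }
}
```

On playback you'd walk `samples` alongside the sequence position and push each value back into the synth parameter. Note that 1/256 s (about 4 ms) is finer than most control changes need; sampling every 10–20 ms usually still sounds smooth and produces far less data.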

Related

How to record a time-limited video with Adobe AIR for iOS

I am trying to record a time-limited video with Adobe AIR for iOS.
For example, I want to implement the following function. Start a one-minute timer before launching CameraUI to record video. When the timeout event happens after one minute, stop recording video, close the CameraUI view and obtain the video data so far.
I have several questions related to that.
How can I stop recording video from outside the CameraUI view (in this case, from the timeout event handler) and then close the CameraUI view? As far as I know, the only way to close the CameraUI view is to press the [Use Video] button or the [Cancel] button inside the CameraUI view. Is it possible to close it from outside?
Even if the first problem is solved, how can I get the video data recorded so far (in this case, the video data before the timeout)? I know that normally we can get a MediaPromise object from the MediaEvent parameter of the complete handler and read the video data from it. But obviously in this case we cannot access the MediaPromise object, because the complete handler itself will not be executed since the [Use Video] button is not pressed.
Is it possible to add a stopwatch showing the remaining recording time while the CameraUI view is open? It seems that CameraUI automatically uses the full screen of the iOS device (in my case, an iPad), and there is no extra space to show the stopwatch.
Are there any solutions or workarounds for the three problems above? I'd really appreciate any ideas. Thanks in advance.
I've never worked with video, especially on iOS, so I'm just putting down my thoughts on this issue; sorry if you find them useless.
I suppose it's impossible to control recording from outside CameraUI (unless you write your own ANE for that), and I think it's questionable design anyway. Why do you need that?
The answer is the same as for 1.
It's impossible to add display objects to native windows (once again, unless you write your own ANE).
In general, if you want more freedom to work with video in AIR, you can do it in three ways:
Write your own ANE.
Stream your video data to your own server and do whatever you want with it.
Least reliable, but you can try it: there is the FLVRecorder library. I've never tried it and don't even know whether it works at all. Or you can try your own approach (save your stage to bitmaps at some framerate, then encode them to video). This is just a suggestion; I don't know whether it will work.
Hope my thoughts help.

Smooth crossfade/transition between two videos?

I have been successfully running the example, Creating a Video Application: http://www.samsungdforum.com/Guide/tut00055/index.html
What I am trying to figure out is how to smoothly transition from one video to another using a crossfade or any other transition. In this example you are forced to call stop on the player before loading another video, causing a black flash in between.
I've tried creating two Video instances, one behind the other, and using jQuery to fade one out and the other in, but I'm having a lot of trouble; it appears jQuery fades don't even apply to the video element. Also, is there some kind of limitation on playing two videos at the same time? Is there a better way to go about this? Would it be better to look into doing this with WebGL instead?
Thanks!
If you use the Samsung API you can play only one video at a time. I can't find the documentation page at the moment.
You can try different approaches:
Use the HTML5 <video> tag; maybe the browser will be capable of playing two of them.
Use a screenshot of the second video and fade it in over the first video before stopping the first and starting the next.
Re-encode your videos into one continuous video in any video editing software. If you need to switch between videos, you can jump to the position where the next video starts.
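The third suggestion (one continuous video) only needs a little bookkeeping: store each clip's duration and compute the offset to seek to. A sketch of that arithmetic, in TypeScript for illustration; the function name is mine, not part of any player API.

```typescript
// Seek position (in seconds) where clip `index` begins inside one
// continuous video built by concatenating clips of the given durations.
// Hypothetical helper, for illustration only.
function startOffset(index: number, durations: number[]): number | null {
  if (index < 0 || index >= durations.length) return null;
  return durations.slice(0, index).reduce((sum, d) => sum + d, 0);
}
```

So with three clips of 30 s, 45 s, and 60 s concatenated, jumping to the third clip means seeking to 75 s in the combined file.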

Screen Capturing in iOS

I have searched the net many times but haven't found a proper solution. I want to implement screen recording in iOS. Let me make it clearer: suppose I am playing a game, and after playing it I want to replay it to check how well I played. So ultimately I want to record every action performed on screen: switching from one screen to another, coming back to the root screen, and so on. Every action must be recorded. I am tired of searching Google. Can anybody tell me how to implement this?
Any help will highly be appreciated.
Thanks in advance.
This might be helpful; check out:
http://codethink.no-ip.org/wordpress/archives/673
Been there - just record the user actions and everything that affects the game. When you want to generate the video, just replay every user action in your world and instead of rendering the UI to the screen, generate video frames.
One of the typical problems is adding audio (music/sound effects) to the video, but that would require a separate question.
If you don't want to generate a video file but only replay the actions on your device, then you can just disable user interaction and replay the user actions directly.
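The record-and-replay idea boils down to storing every input event with its timestamp, then feeding the same events back through the game logic. A minimal sketch, in TypeScript for illustration; the types are hypothetical, not any framework's API, and a real game would carry touch coordinates, button IDs, and so on.

```typescript
// A timestamped input event. Hypothetical type for illustration.
interface RecordedAction {
  time: number;  // seconds since the session started
  name: string;  // e.g. "tap", "swipe", "pushScreen"
}

class ActionRecorder {
  actions: RecordedAction[] = [];

  // Call from every input handler during live play.
  record(name: string, time: number): void {
    this.actions.push({ time, name });
  }

  // Replay by handing each action (in time order) to the same handler
  // the live game uses. In the app you would schedule these with a
  // timer or per-frame update rather than calling synchronously.
  replay(handler: (action: RecordedAction) => void): void {
    [...this.actions]
      .sort((a, b) => a.time - b.time)
      .forEach(handler);
  }
}
```

During replay you disable user interaction and drive the game from these events; to get an actual video file, you render each replayed frame into a video encoder instead of (or in addition to) the screen.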

Dim volume of iPodMusicPlayer to 50% in iOS

I know that you can easily set the volume property of the music player, but I want to do it smoothly like Google Maps does when they use the voiceover for navigation instructions.
I was wondering what the best way to do this is.
Thanks!
I would try using a repeating NSTimer. Every time the timer fires, you lower the volume a bit; when it reaches the target value, you invalidate the timer.
Other ways of getting a repeated event (so that you can do something in stages gradually over time) are DISPATCH_SOURCE_TYPE_TIMER and CADisplayLink. But I think a timer is probably the simplest way to get started.
If you have a pre-existing sound that you're playing, a completely different solution is to apply a fadeout to it before you start playing it (and then just play it all at the same volume, because the sound itself fades out, do you see). AVFoundation gives you the tools to do that (e.g. setVolumeRampFromStartVolume:toEndVolume:timeRange:).
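The timer approach amounts to stepping the volume along a linear ramp. A sketch of computing the per-tick values, in TypeScript just to show the arithmetic; in the app you would set each value on the music player as the timer fires. The function name and defaults are my own.

```typescript
// Linear volume ramp from `start` to `end` over `duration` seconds,
// sampled every `interval` seconds: one value per timer fire, with
// the final value exactly `end`. Illustration only.
function volumeRamp(start: number, end: number,
                    duration: number, interval: number): number[] {
  if (duration <= 0 || interval <= 0) return [end];
  const steps = Math.max(1, Math.round(duration / interval));
  const values: number[] = [];
  for (let i = 1; i <= steps; i++) {
    values.push(start + ((end - start) * i) / steps);
  }
  return values;
}
```

With a repeating timer at `interval`, each fire consumes the next value; when the array is exhausted you invalidate the timer. Dimming from 1.0 to 0.5 over half a second at 0.05 s ticks gives ten small steps, which already sounds smooth rather than abrupt.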

YouTube embedded in Delphi: autoplay & loop refreshing

I've just made a simple YouTube player in an SDI app using embedded links inside the TWebBrowser component, and managed to create autoplay and loop (loop one and loop all) buttons as well as quality-changing radio buttons. But they're all based on hyperlink changes and therefore don't take effect while a video is playing; I have to send the "Go" command again. I've implemented it with the buttons mentioned above, but it's annoying to restart the video when making a change, especially if I just want to turn on looping somewhere in the middle of a song.
Is there any way of reading the current time and then setting a link that starts at that time, or any other way of solving this, so that pressing a button is accepted without a refresh and won't restart the video from the beginning?
The default buttons for changing quality within the player itself do exactly that, but I can't find a command for it.
Also, on YouTube there are playlist buttons to loop and shuffle, but I can't find those either.
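Since the player here is driven entirely by URLs, one workaround is to rebuild the embed link with the playback position baked in: the YouTube embed URL accepts `start` (seconds) and `autoplay=1`, and looping a single embedded video requires `loop=1` together with `playlist=` set to the same video ID. If the current position can be read, the reload can resume from roughly the same spot instead of the beginning. A sketch of building such a URL, in TypeScript for illustration; the same string logic translates directly to Delphi.

```typescript
// Build a YouTube embed URL that resumes at `startSeconds`, optionally
// autoplaying and looping. Looping a single embedded video requires
// passing the same video ID as `playlist` (a quirk of the embed player).
function embedURL(videoID: string, startSeconds: number,
                  autoplay: boolean, loop: boolean): string {
  const params = [`start=${startSeconds}`];
  if (autoplay) params.push("autoplay=1");
  if (loop) params.push("loop=1", `playlist=${videoID}`);
  return `https://www.youtube.com/embed/${videoID}?${params.join("&")}`;
}
```

This still reloads the page, so it is a workaround rather than a fix; for seamless in-place control you would need the YouTube IFrame Player API (JavaScript calls like seeking and quality changes against a live player) rather than plain embed links.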
