Is it possible to use multiple MTAudioProcessingTaps?

In my iOS app, I'm using AVFoundation's AVComposition to create a video with multiple audio tracks, and I'm trying to let the user see the volume/power level for each audio track. I've successfully implemented this for one track, but as soon as I try to use a second MTAudioProcessingTap, it fails with OSStatus error -12780. In fact, if I use a processing tap and then go 'back' (deallocating the entire view controller) and re-open that particular screen, it won't even attach the processing tap again, even though the first AVPlayer playing the composition has been deallocated. From searching, I found that I have to manually release the tap and clear out the player's audioMix to fix that, but that isn't my problem now: I can't clear out the other processing tap, because I need them both!
I'm not completely sure what MTAudioProcessingTap actually is, as it is by far the least documented piece of API ever to come out of the Apple dev team. I've watched the WWDC session and gone through the sample iPad project they made, but I can't figure out how to have two taps running.
I figured I might not actually need two taps; maybe one tap could handle more than one audio track. But even if I somehow manage to use the same tap for two audio tracks, I don't know how to tell the tracks apart in the static callbacks. Is there another way to monitor audio levels from an AVComposition playing in an AVPlayer, or is there a way to make this approach work?
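For what it's worth, the usual way to tell taps apart in the static callbacks is the clientInfo pointer in MTAudioProcessingTapCallbacks: whatever you pass there is handed to the init callback, which can stash it in tapStorage, and every later callback can read it back with MTAudioProcessingTapGetStorage. Below is a minimal Swift sketch of one tap per track, where TrackLevelContext is a hypothetical per-track object of my own; this is only a sketch, not a tested fix for the -12780 error.

import AVFoundation
import MediaToolbox

// Hypothetical per-track context so each tap knows which track it belongs to.
final class TrackLevelContext {
    let trackID: CMPersistentTrackID
    init(trackID: CMPersistentTrackID) { self.trackID = trackID }
}

// The init callback copies clientInfo into tapStorage; every other callback
// can then recover it with MTAudioProcessingTapGetStorage.
let tapInit: MTAudioProcessingTapInitCallback = { _, clientInfo, tapStorageOut in
    tapStorageOut.pointee = clientInfo
}

let tapFinalize: MTAudioProcessingTapFinalizeCallback = { tap in
    // Balance the passRetained below.
    Unmanaged<TrackLevelContext>.fromOpaque(MTAudioProcessingTapGetStorage(tap)).release()
}

let tapProcess: MTAudioProcessingTapProcessCallback = { tap, numberFrames, _, bufferListInOut, numberFramesOut, flagsOut in
    let status = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, nil, numberFramesOut)
    guard status == noErr else { return }
    let context = Unmanaged<TrackLevelContext>
        .fromOpaque(MTAudioProcessingTapGetStorage(tap))
        .takeUnretainedValue()
    // context.trackID now tells you which audio track these samples came from,
    // so you can compute and report a level per track here.
}

// One tap per track, each attached to its own input parameters in the audio mix.
func makeInputParameters(for track: AVAssetTrack) -> AVMutableAudioMixInputParameters {
    let params = AVMutableAudioMixInputParameters(track: track)
    let context = TrackLevelContext(trackID: track.trackID)
    var callbacks = MTAudioProcessingTapCallbacks(
        version: kMTAudioProcessingTapCallbacksVersion_0,
        clientInfo: UnsafeMutableRawPointer(Unmanaged.passRetained(context).toOpaque()),
        init: tapInit,
        finalize: tapFinalize,
        prepare: nil,
        unprepare: nil,
        process: tapProcess)
    var tap: Unmanaged<MTAudioProcessingTap>?
    if MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                                  kMTAudioProcessingTapCreationFlag_PostEffects, &tap) == noErr {
        params.audioTapProcessor = tap?.takeRetainedValue()
    }
    return params
}

Each AVMutableAudioMixInputParameters created this way goes into the same AVMutableAudioMix's inputParameters array; whether that avoids the -12780 failure for two simultaneous taps I can't say, but it at least answers the "how do I tell them apart" part.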

Related

tvOS issue with Siri search deeplinking integration

I have an issue where, while a video is playing, the user uses Siri to search for a different movie. Siri loads the corresponding movie-details page, the user selects that movie to play, and it deep-links into my app, which is already playing a movie. When I play the newly selected movie and dismiss the AVPlayer and AVPlayerViewController, audio from the previous video still continues to play. Somehow the old AVPlayer is not cleared, even though I clear all subviews from the window and initialize its super view controller class again. I am clueless about what I can do to erase the older instance of AVPlayer. Let me know if anyone has any suggestions or has faced a similar issue.
A few suggestions:
1. Are you subclassing AVPlayerViewController? If so, that's a bad idea; the API docs specifically say not to do that.
2. Add a deinit function. If it's not being called when the old AVPlayer is dismissed, you know you have a retention problem. This is often caused by registering for notifications or boundary time observers and never removing them (see the sketch after this list).
3. If your view controller has a reference to the AVPlayer object, try overriding viewDidDisappear to call player.pause() and then set the player reference first to a new AVPlayer() instance and then to nil. I'm not sure why this helps, but sometimes it does.
Definitely implement #2 above: if deinit is not getting called, you almost certainly have a retain issue.
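A minimal sketch of points 2 and 3, assuming your own view controller owns the player; the class and property names here are just examples:

import AVFoundation
import UIKit

final class MoviePlaybackViewController: UIViewController {
    var player: AVPlayer?
    private var timeObserver: Any?   // e.g. a boundary or periodic time observer

    override func viewDidDisappear(_ animated: Bool) {
        super.viewDidDisappear(animated)
        player?.pause()
        // Remove observers from the player that registered them before letting go of it.
        if let observer = timeObserver {
            player?.removeTimeObserver(observer)
            timeObserver = nil
        }
        // The "replace with a fresh instance, then nil" trick from point 3.
        player = AVPlayer()
        player = nil
    }

    deinit {
        // If this never prints after the old player is dismissed, something is still
        // retaining the controller, commonly a NotificationCenter observer or a
        // time observer that was never removed.
        print("MoviePlaybackViewController deinit")
        NotificationCenter.default.removeObserver(self)
    }
}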

How to record a time-limited video with Adobe AIR for iOS

I am trying to record a time-limited video with Adobe AIR for iOS.
For example, I want to implement the following function. Start a one-minute timer before launching CameraUI to record video. When the timeout event happens after one minute, stop recording video, close the CameraUI view and obtain the video data so far.
I have several questions related to that.
1. How can I stop recording video from outside the CameraUI view (in this case, from the timeout event handler) and then close the CameraUI view? As far as I know, the only way to close the CameraUI view is to press the [Use Video] or [Cancel] button inside the view. Is it possible to close it from outside?
2. Even if the first problem is solved, how can I obtain the video data recorded so far (in this case, the video recorded before the timeout)? Normally we get a MediaPromise object from the MediaEvent parameter of the complete handler and read the video data from it, but in this case we cannot access the MediaPromise object, because the complete handler is never executed: the [Use Video] button is never pressed.
3. Is it possible to show a stopwatch with the remaining recording time while the CameraUI view is open? CameraUI seems to automatically use the full screen of the iOS device (in my case, an iPad), so there is no extra space for the stopwatch.
Are there any solutions or workarounds for the three problems above? I'd really appreciate any ideas. Thanks in advance.
I have never worked with video, especially on iOS, so I'm just putting down my thoughts on this; sorry if you find them useless.
1. I suppose it's impossible to stop and save the recording from outside CameraUI (unless you write your own ANE for it), and I think it's questionable design anyway: why do you need that?
2. Same answer as 1.
3. It's impossible to add display objects on top of native windows (once again, unless you write your own ANE).
In general, if you want more freedom when working with video in AIR, you can do it in three ways:
1. Write your own ANE.
2. Stream your video data to your own server and do whatever you want with it there.
3. The least reliable way, but you can try it: there is the FLVRecorder library. I've never tried it and don't even know whether it works at all. Or you can try your own approach (save the stage to bitmaps at some frame rate and then encode them to video). It's just a suggestion; I don't know whether it will work.
I hope my thoughts help.

multiple MPMoviePlayerController on a tableview

I have multiple MPMoviePlayerControllers in a UITableView (in different sections).
I know that only one can play at a time, but the problem is that if a different player was paused, it gets stuck and I need to re-initialize it.
I could do a [tableView reload] on everything except the current player, but that seems cycle-consuming and clumsy (and it isn't that simple to reload everything but me).
Is there a better way? Maybe a third-party open-source package that handles this nicely?
OK, first of all, why do you need multiple MPMoviePlayerController instances if you only play one video at a time? You could create a single MPMoviePlayerController instance and use it to play all the videos one by one.
Also, I think AVPlayer with AVPlayerLayer would be a more flexible solution for playing videos on iOS. Take a look at the AVFoundation framework reference for more information about AVPlayer.
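As a rough illustration of the shared-player idea with AVPlayer and AVPlayerLayer (all the names here are examples, not from the question):

import AVFoundation
import UIKit

// One shared player; each cell only hosts an AVPlayerLayer pointing at it.
final class SharedVideoPlayer {
    static let shared = SharedVideoPlayer()
    let player = AVPlayer()

    func play(url: URL, in layer: AVPlayerLayer) {
        layer.player = player
        player.replaceCurrentItem(with: AVPlayerItem(url: url))
        player.play()
    }
}

// Example table view cell that displays the shared player's video when asked to.
final class VideoCell: UITableViewCell {
    let playerLayer = AVPlayerLayer()

    override func layoutSubviews() {
        super.layoutSubviews()
        if playerLayer.superlayer == nil {
            contentView.layer.addSublayer(playerLayer)
        }
        playerLayer.frame = contentView.bounds
    }
}

When a row is tapped you would call something like SharedVideoPlayer.shared.play(url: videoURL, in: cell.playerLayer), so no cell ever owns a player that can get stuck in a paused state.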
Good Luck!

AVQueuePlayer stops playing when I navigate to previous view

I have used AVQueuePlayer several times, and the default behaviour is for playback to continue when you change views, but in my case the player stops when I navigate back (via segue) to the previous view I came from. I put a breakpoint in dealloc to see if the AVQueuePlayer is released, and from what I can see it is not deallocated (I hold a strong reference to it through a property). Please help!
I am streaming audio from several server URLs, not playing local files. I create the URLs, use them to make AVPlayerItems, and add the player items to an array, which I then use to initialize the AVQueuePlayer. I use GCD to make sure the array is completely ready before I start the AVQueuePlayer.
As soon as I tap the back button it stops. I am pulling my hair out.
Create a singleton class that provides an interface to the AVQueuePlayer. That way you will be sure that it's alive even when you pop your view controllers.
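A minimal sketch of that singleton idea; the class name and methods are my own invention:

import AVFoundation

// Lives for the whole app, so popping a view controller cannot deallocate it.
final class AudioQueueManager {
    static let shared = AudioQueueManager()
    private var queuePlayer: AVQueuePlayer?

    private init() {}

    func play(urls: [URL]) {
        let items = urls.map { AVPlayerItem(url: $0) }
        queuePlayer = AVQueuePlayer(items: items)
        queuePlayer?.play()
    }

    func pause() {
        queuePlayer?.pause()
    }
}

Any view controller then calls AudioQueueManager.shared.play(urls:) and never owns the queue player itself, so popping the controller cannot take the player down with it.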

AVComposition breaks on Airplay

I have a video composition which I'd like to play over Airplay (without mirroring). The app works as expected when using normal Airplay mirroring, but I'd like to get the speed, reliability, and resolution bump you get from using Airplay video instead.
The problem is that when I set
player.usesAirPlayVideoWhileAirPlayScreenIsActive = YES;
...the player goes blank.
Notes:
Since I don't create separate windows for each display, they are both trying to use the same AVPlayer.
My AVVideoComposition contains different files and adds opacity ramps between them.
This unanswered question suggests that the problem is more likely due to the fact that I'm playing an AVComposition than the use of a shared player: AVComposition doesn't play via Airplay Video
Two questions:
Do I have to get rid of the player on the iPad?
Can an AVVideoComposition ever be played over AirPlay?
I can't comment yet, so I have to post this as an answer, although it may not fully answer the questions.
I had a similar issue, and in the end I found out that when AVPlayer plays an AVComposition it simply doesn't display anything on the external display. That's why I had to handle it myself by listening for the UIScreen connection notifications.
I have to say it all worked pretty well. I first check whether there is more than one screen; if there is, I simply move the AVPlayer onto that screen, while displaying a simple message on the device's screen saying where the content is playing, plus the name of the AirPlay device. This way I can put whatever I want on the external display, and it is not very complicated. I do the same thing when I receive UIScreenDidConnectNotification.
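A rough sketch of that approach, assuming you own the AVPlayer and an AVPlayerLayer; the class and property names are mine, and the modern Swift notification names are used:

import AVFoundation
import UIKit

// Keeps the AVPlayerLayer on whichever screen is available.
final class ExternalPlaybackController {
    let player: AVPlayer
    private let playerLayer: AVPlayerLayer
    private var externalWindow: UIWindow?

    init(player: AVPlayer) {
        self.player = player
        self.playerLayer = AVPlayerLayer(player: player)

        let center = NotificationCenter.default
        center.addObserver(forName: UIScreen.didConnectNotification,
                           object: nil, queue: .main) { [weak self] note in
            if let screen = note.object as? UIScreen {
                self?.moveToExternalScreen(screen)
            }
        }
        center.addObserver(forName: UIScreen.didDisconnectNotification,
                           object: nil, queue: .main) { [weak self] _ in
            self?.externalWindow = nil   // move playback back to the device here
        }

        // If an external screen is already connected (AirPlay display on), use it now.
        if UIScreen.screens.count > 1 {
            moveToExternalScreen(UIScreen.screens[1])
        }
    }

    private func moveToExternalScreen(_ screen: UIScreen) {
        let window = UIWindow(frame: screen.bounds)
        window.screen = screen
        let host = UIViewController()
        playerLayer.frame = window.bounds
        host.view.layer.addSublayer(playerLayer)
        window.rootViewController = host
        window.isHidden = false
        externalWindow = window
        // On the device's own screen, show a "Playing on <AirPlay device name>" label instead.
    }
}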
That was fine until I noticed that the composition plays back really choppily on the external display, even if it consists of only one video without any complex edits or overlays. The same video plays perfectly if I save it to the Camera Roll or play it with MPMoviePlayerController.
I've tried many things, like lowering the resolution and lowering the renderScale, but with no success.
One thing that bothers me more is how Apple actually does this in iMovie: if you have AirPlay enabled and you play a project (note that it's not rendered yet, so it must be using a composition to display it), then right after you tap the play button it opens a player that plays the content really smoothly on the external monitor. If, however, you activate AirPlay from within the player, the player closes and the project starts rendering; after that it plays, I think, using MPMoviePlayerController.
I'm still trying to find a solution and will post back if I have any success.
So, for the two questions:
1. I don't see why you would have to get rid of the player on the iPad.
2. Yes, it can be played, but with a different technique and, obviously, with some issues.
In the app's Info.plist, create a new item called "Required background modes" (the raw key is UIBackgroundModes) and add a new array element called "App plays audio or streams audio/video using AirPlay" (the raw value is "audio").
I'm not sure whether you have already tried this, but you don't mention it in your post.
Cheers!
