Multiple MPMoviePlayerControllers on a table view - iOS

I have multiple MPMoviePlayerController instances on a UITableView (in different sections).
I know that only one can play at a time, but the problem is that if a different player was paused, it gets stuck and I need to re-initialize it.
I could do a sophisticated [tableView reloadData] on everything except the current cell, but that seems cycle-consuming and clumsy (and it isn't that simple to reload everything but the current player).
Is there a better way? Maybe a third-party open-source library that handles this nicely?

OK, first of all, why do you need multiple MPMoviePlayerController instances if you only play one video at a time? You could create a single MPMoviePlayerController instance and use it to play all the videos one by one.
Also, I think AVPlayer with AVPlayerLayer would be a more extensible solution for playing videos on iOS. Take a look at the AVFoundation framework reference for more information about AVPlayer.
Good luck!
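For what it's worth, here is a minimal Objective-C sketch of the single-player idea: one AVPlayer/AVPlayerLayer pair that gets moved to whichever cell the user taps. The controller, property and method names are placeholders rather than anything from the question:

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

// Hypothetical controller; the point is one shared player for the whole table.
@interface VideoTableViewController : UITableViewController
@property (nonatomic, strong) AVPlayer *sharedPlayer;
@property (nonatomic, strong) AVPlayerLayer *sharedPlayerLayer;
@end

@implementation VideoTableViewController

// Call this when a row is tapped; containerView is the tapped cell's video area.
- (void)playVideoAtURL:(NSURL *)url inContainerView:(UIView *)containerView
{
    if (!self.sharedPlayer) {
        self.sharedPlayer = [AVPlayer playerWithURL:url];
        self.sharedPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:self.sharedPlayer];
        self.sharedPlayerLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    } else {
        // Reuse the same player: swap the item instead of re-creating anything.
        [self.sharedPlayer replaceCurrentItemWithPlayerItem:[AVPlayerItem playerItemWithURL:url]];
    }

    // Detach the layer from the previous cell and attach it to the tapped one.
    [self.sharedPlayerLayer removeFromSuperlayer];
    self.sharedPlayerLayer.frame = containerView.bounds;
    [containerView.layer addSublayer:self.sharedPlayerLayer];

    [self.sharedPlayer play];
}

@end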

Related

AVPlayer with Streaming videos or NSFileHandle

In my app I need to play multiple videos one after another. Currently, I am streaming the videos using AVPlayer, but it seems very laggy and the videos freeze quite often. I'm wondering if downloading the files with NSFileHandle would provide a better user experience with less lag. BUT, I'm worried about memory issues.
Does anyone have any recommendations as to which way is more efficient? Or, for example, how Snapchat manages to play such a large number of videos so smoothly? Thanks.
To control the playback of assets, you use an AVPlayer object. During playback, you can use an AVPlayerItem instance to manage the presentation state of an asset as a whole, and an AVPlayerItemTrack object to manage the presentation state of an individual track. To display video, you use an AVPlayerLayer object.
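If you stay with streaming, one thing worth trying for back-to-back playback is AVQueuePlayer, which can begin preparing the next item while the current one plays. A minimal sketch, with placeholder URLs and a placeholder view controller:

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface PlaylistViewController : UIViewController
@property (nonatomic, strong) AVQueuePlayer *queuePlayer;
@end

@implementation PlaylistViewController

- (void)viewDidLoad
{
    [super viewDidLoad];

    // Placeholder URLs; in practice these would be your streamed video URLs.
    NSArray *videoURLs = @[[NSURL URLWithString:@"https://example.com/clip1.mp4"],
                           [NSURL URLWithString:@"https://example.com/clip2.mp4"]];

    NSMutableArray *items = [NSMutableArray array];
    for (NSURL *url in videoURLs) {
        [items addObject:[AVPlayerItem playerItemWithURL:url]];
    }

    // AVQueuePlayer plays the items in order and advances automatically.
    self.queuePlayer = [AVQueuePlayer queuePlayerWithItems:items];

    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.queuePlayer];
    playerLayer.frame = self.view.bounds;
    [self.view.layer addSublayer:playerLayer];

    [self.queuePlayer play];
}

@end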

Playing sound without lag in Swift

As many developers know, using AVAudioPlayer for playing sound in games can result in jerky animation/movement, because of a tiny delay each time a sound is played.
I used to overcome this in Objective-C, by using OpenAL through a wrapper class (also in Obj-C).
I now use Swift for all new projects, but I can't figure out how to use my wrapper class from Swift. I can import the class (through a bridging header), but when I need to create ALCdevice and ALCcontext objects in my Swift file, Xcode won't accept it.
Does anyone have or know of a working example of playing a sound using OpenAL from Swift? Or maybe sound without lag can be achieved in some other way in Swift?
I've run into a delay-type problem once; I hope your problem is the same one I encountered.
In my situation, I was using SpriteKit to play my sounds via SKAction.playSoundFileNamed:. It would always lag about half a second behind where I wanted it to play.
This is because it takes time to allocate memory for each SKAction call. To solve this, store the sound action in a variable so you can reuse the sound later without instantiating new objects. That got rid of the delay for me. This technique would probably work for AVAudioPlayer too.
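A minimal sketch of that, shown in Objective-C for consistency with the other snippets on this page (the same pattern translates directly to Swift); the scene, property and file names are just illustrative:

#import <SpriteKit/SpriteKit.h>

@interface GameScene : SKScene
@property (nonatomic, strong) SKAction *laserSound; // created once, reused many times
@end

@implementation GameScene

- (void)didMoveToView:(SKView *)view
{
    // Allocate the sound action once, up front, instead of on every shot.
    self.laserSound = [SKAction playSoundFileNamed:@"laser.caf" waitForCompletion:NO];
}

- (void)fireLaser
{
    // Reusing the stored action avoids the per-call allocation delay.
    [self runAction:self.laserSound];
}

@end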

Switch between audio tracks in MPMoviePlayerController

In the app I'm currently working on I have some videos with multiple audio tracks (in different languages), and I'd like to be able to switch between these tracks programmatically. There is a button that allows the user to switch between them manually (it presents this screen when tapped: http://i.stack.imgur.com/mkUTZ.png), so I assumed this wouldn't be too hard. However, after two hours of research I haven't found anything useful. There's this StackOverflow question that suggests it's impossible to play video with multiple audio streams in the first place, but apparently that no longer holds true. There are also some examples with AVFoundation (I haven't really looked into them), but none with MPMoviePlayer or MPMoviePlayerController.
So I'd like to know if my goal is achievable with the stock MPMoviePlayerController and, if not, what the good alternatives to it are.
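For reference, the AVFoundation examples the question alludes to generally go through AVMediaSelectionGroup. Below is a rough sketch of that route (the URL, function name and language code are placeholders, and it says nothing about what the stock MPMoviePlayerController can or cannot do):

#import <AVFoundation/AVFoundation.h>

// List an asset's alternate audio tracks and select one on the AVPlayerItem.
static AVPlayer *PlayerPreferringLanguage(NSString *languageCode)
{
    NSURL *url = [NSURL URLWithString:@"https://example.com/movie-with-two-audio-tracks.m3u8"];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];

    AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
    AVPlayer *player = [AVPlayer playerWithPlayerItem:item];

    // All selectable audio options of the asset (nil if there is only one track).
    AVMediaSelectionGroup *audioGroup =
        [asset mediaSelectionGroupForMediaCharacteristic:AVMediaCharacteristicAudible];

    // Pick the first option whose language matches; otherwise keep the default.
    for (AVMediaSelectionOption *option in audioGroup.options) {
        if ([[option.locale objectForKey:NSLocaleLanguageCode] isEqualToString:languageCode]) {
            [item selectMediaOption:option inMediaSelectionGroup:audioGroup];
            break;
        }
    }

    return player;
}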

Is it possible to use multiple 'MTAudioProcessingTap's?

In my iOS app, I'm using AVFoundation's AVComposition to create a video with multiple audio tracks. I am trying to let the user see the volume/power level for each audio track. I've successfully implemented this for one track, but as soon as I try to use a second MTAudioProcessingTap, it fails with error -12780. In fact, if I use a processing tap and then go 'back' - deallocating the entire view controller - and re-open that particular window, it won't even attach the processing tap again, even if the first AVPlayer playing the composition has been deallocated. To solve this, I found out from searching that I have to manually release the tap and clear out the audioMix for the player, but that's not my problem now. Now, I can't clear out the other processing tap - I need them both!
I'm not completely sure what the MTAudioProcessingTap actually is, as it is by far the least documented piece of code ever to come out of the Apple dev team. I've watched the WWDC session and gone through the iPad sample project they made, but I can't figure out how to have two taps running.
I figured I might not actually need two taps, but could maybe use one to handle more than one audio track. However, even if I somehow managed to use the same tap for two audio tracks, I wouldn't know how to tell them apart in the static callbacks. Is there another way to monitor audio levels from an AVComposition playing in an AVPlayer, or a way to make this approach work?
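One mechanism that may help with telling taps apart (a sketch, not a full answer to the question): the clientInfo pointer in MTAudioProcessingTapCallbacks is handed to the init callback, which can stash it as the tap's storage, so each tap carries its own per-track context into the static callbacks. The TrackLevelContext struct and function names below are purely illustrative:

#import <stdlib.h>
#import <MediaToolbox/MediaToolbox.h>

// Purely illustrative per-track context; one instance per tap/track.
typedef struct {
    int   trackIndex;
    float lastLevel;
} TrackLevelContext;

// init: stash the clientInfo pointer as the tap's storage so each tap
// carries its own context into the static callbacks.
static void TapInit(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut)
{
    *tapStorageOut = clientInfo;
}

static void TapFinalize(MTAudioProcessingTapRef tap)
{
    free(MTAudioProcessingTapGetStorage(tap));
}

// process: pull the audio through and read back the per-tap context.
static void TapProcess(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
                       MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut,
                       CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut)
{
    TrackLevelContext *context = MTAudioProcessingTapGetStorage(tap);
    MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                       flagsOut, NULL, numberFramesOut);
    // ... compute the level of bufferListInOut and store it in context->lastLevel,
    // keyed by context->trackIndex so the two tracks stay distinct ...
}

// Create one tap per track, each with its own context.
static MTAudioProcessingTapRef CreateTapForTrack(int trackIndex)
{
    TrackLevelContext *context = calloc(1, sizeof(TrackLevelContext));
    context->trackIndex = trackIndex;

    MTAudioProcessingTapCallbacks callbacks = {
        .version    = kMTAudioProcessingTapCallbacksVersion_0,
        .clientInfo = context,
        .init       = TapInit,
        .prepare    = NULL,
        .process    = TapProcess,
        .unprepare  = NULL,
        .finalize   = TapFinalize,
    };

    MTAudioProcessingTapRef tap = NULL;
    MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                               kMTAudioProcessingTapCreationFlag_PostEffects, &tap);
    return tap; // attach via AVMutableAudioMixInputParameters.audioTapProcessor
}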

AVAudioPlayer initialization & memory management

I am trying to figure out something -
Every time I use AVAudioPlayer I need to initialize a new AVAudioPlayer object.
I have 10 Sentence objects on a view, and I wish to add a "PlaySentence" method to each Sentence object so that when the user taps the Sentence, the app plays a sound file.
I need this behavior in many views, so I thought of adding the method to the object class so I can simply call -
[Sentence playSound];
Since AVAudioPlayer has to be initialized every time I wish to use it anyway, I cannot see why this would be a more expensive operation.
Am I right / is this a good approach for this need, and why?
Thanks
Shani
So if I understand you right, you want your Sentence object to have a playSound method which sets up an AVAudioPlayer and plays the sound.
You can definitely do it like that, but be aware that if you have a lot of Sentence objects then you will end up creating a lot of AVAudioPlayer objects. You could keep the high-water mark down by releasing each player after its file has finished playing.
Another way you could do this is to have a method on Sentence that returns the URL of the file to play, and then keep a single AVAudioPlayer instance in the view controller where you want to play the sounds, setting it up each time with the correct file. This would be my personal suggestion.
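Here is a short sketch of that suggested setup - Sentence only knows the URL of its sound file, and the view controller owns a single AVAudioPlayer property that is set up per tap. Class, property and file names are illustrative, not from the question:

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface Sentence : NSObject
@property (nonatomic, copy) NSString *soundFileName;
- (NSURL *)soundFileURL;
@end

@implementation Sentence
- (NSURL *)soundFileURL
{
    // The sentence only exposes where its sound lives.
    return [[NSBundle mainBundle] URLForResource:self.soundFileName withExtension:@"m4a"];
}
@end

@interface SentencesViewController : UIViewController
@property (nonatomic, strong) AVAudioPlayer *audioPlayer; // one player for the whole screen
@end

@implementation SentencesViewController

- (void)playSentence:(Sentence *)sentence
{
    NSError *error = nil;
    // Re-point the single player at the tapped sentence's file.
    self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:[sentence soundFileURL]
                                                              error:&error];
    if (!self.audioPlayer) {
        NSLog(@"Could not create player: %@", error);
        return;
    }
    [self.audioPlayer play];
}

@end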
I think the best solution for you is to use SimpleAudioEngine from the Cocos2D library.
It's easy to use and integrates well. As far as I know, it uses OpenAL and AVAudioPlayer under the hood. It has simple interfaces to control background music and effects, and works with mp3, wav, m4a and other formats and codecs.
Using it is very easy:
// It's better to preload all effects during app start so they play immediately
// when you call the playEffect: method.
[[SimpleAudioEngine sharedEngine] preloadEffect:@"sentence.m4a"];
...
// Play the effect where you need it, using the same file name you preloaded.
[[SimpleAudioEngine sharedEngine] playEffect:@"sentence.m4a"];
You can find it here, inside the Cocos2D library, and you can find all the useful information about it here.
