Use one player to play multiple assets with different durations, and looping - iOS

I want to use one player to play multiple audio files.
The audio could be streaming media or local files.
The audio files have different durations.
The audio files should play together and loop, so it works just like a mix player.
Media could be inserted or removed at any time.
Let's say there are two media items: media_a is 15 seconds long and media_b is 30 seconds long. media_a should start playing again at second 16.
Right now I'm using two players to play the audio, which works fine, but it isn't good when using AirPlay.
I found some approaches online:
Using AVMutableCompositionTracks to create an AVPlayerItem,
but media_a does not play again at second 16.
Using the least common multiple of the media durations to create one big player item (a sketch of this is below).
I don't think this is good, and media could not easily be inserted or removed.
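A minimal sketch of that least-common-multiple approach, assuming local (non-streaming) assets with whole-second durations. makeLoopedItem is a hypothetical helper name, not an AVFoundation API:

```swift
import AVFoundation

// Build one AVPlayerItem in which every asset repeats back to back until
// all tracks span the least common multiple of the durations, so a 15 s
// clip restarts at second 16 inside a 30 s item.
func makeLoopedItem(assets: [AVURLAsset]) throws -> AVPlayerItem {
    let composition = AVMutableComposition()

    // LCM of the durations in whole seconds (e.g. 15 s and 30 s -> 30 s).
    func gcd(_ a: Int, _ b: Int) -> Int { b == 0 ? a : gcd(b, a % b) }
    let seconds = assets.map { Int($0.duration.seconds.rounded()) }
    let lcmSeconds = seconds.reduce(1) { $0 / gcd($0, $1) * $1 }
    let total = CMTime(seconds: Double(lcmSeconds), preferredTimescale: 600)

    for asset in assets {
        guard let source = asset.tracks(withMediaType: .audio).first,
              let track = composition.addMutableTrack(
                  withMediaType: .audio,
                  preferredTrackID: kCMPersistentTrackID_Invalid)
        else { continue }

        var cursor = CMTime.zero
        while cursor < total {
            try track.insertTimeRange(
                CMTimeRange(start: .zero, duration: asset.duration),
                of: source,
                at: cursor)
            cursor = cursor + asset.duration
        }
    }
    return AVPlayerItem(asset: composition)
}
```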
So please help me. Any clue would be appreciated.

Related

Concatenate Audio Recordings for Playback in AudioKit

I am trying to create a recording app that has the ability to stop and start an audio recording.
My idea to achieve this is to have AudioKit record and save a new file (.aac) every time the stop button is clicked. Then, when it plays the full recording back, it would essentially concatenate all the different .aac files together. (My understanding is that I can't continue recording to the end of a file once it's saved.) Example:
It records three different recordings, so the directory folder contains [1.aac, 2.aac, 3.aac]. When played back, the user would think it's one file.
To achieve this, do I use a single AKPlayer or multiple? I would need a single playback slider and also a playback time label, and both of these would have to correlate to the "single concatenated" file of [1.aac, 2.aac, 3.aac].
This is the first time I have used AudioKit, I really appreciate any advice or solutions to this. Thanks!
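Not an AudioKit-specific answer, but here is a sketch of the concatenation step using plain AVFoundation (which AudioKit sits on top of), assuming the segment files from the example; stitchSegments is a hypothetical helper. A single player item built this way gives one overall duration to drive the slider and time label:

```swift
import AVFoundation

// Stitch the saved segments into one composition so a single player,
// slider, and time label treat them as one recording.
func stitchSegments(urls: [URL]) throws -> AVPlayerItem {
    let composition = AVMutableComposition()
    let track = composition.addMutableTrack(
        withMediaType: .audio,
        preferredTrackID: kCMPersistentTrackID_Invalid)

    var cursor = CMTime.zero
    for url in urls {
        let asset = AVURLAsset(url: url)
        guard let source = asset.tracks(withMediaType: .audio).first else { continue }
        try track?.insertTimeRange(
            CMTimeRange(start: .zero, duration: asset.duration),
            of: source,
            at: cursor)
        cursor = cursor + asset.duration
    }
    // composition.duration now spans all segments; drive the slider from it
    // and update the time label with a periodic time observer on the player.
    return AVPlayerItem(asset: composition)
}

// Usage: one player for the whole "concatenated" recording.
// let item = try stitchSegments(urls: [url1, url2, url3])
// let player = AVPlayer(playerItem: item)
```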

Recording output audio with Swift

Is it possible to record output audio in an app using Swift? So, for example, say I'm listening to a podcast, and I want to, within a separate app, record a small segment of the podcast's audio. Is there any way to do that?
I've looked around but have only been able to find information on recording from the microphone and such.
It depends on how you are producing the audio. If the production of the audio is within your control, you can put a tap on the output and record to a file as it plays. The easiest way is with the new AVAudioEngine feature (there are other ways, but AVAudioEngine is basically an easy front end for them).
Of course, if the real problem is to take a copy of a podcast, then obviously all you have to do is download the podcast as opposed to listening to it. Similarly, you could buffer and save streaming audio to a file. There are many apps that do this. But this is not because the device's output is being hijacked; it is, again, because we have control of the sound data itself.
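A minimal sketch of that tap-on-output idea, assuming the audio is produced through your own AVAudioEngine; the player node and file URL are illustrative:

```swift
import AVFoundation

let engine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()
engine.attach(playerNode)
engine.connect(playerNode, to: engine.mainMixerNode, format: nil)

// Write everything that reaches the mixer output to a file as it plays.
let output = engine.mainMixerNode
let format = output.outputFormat(forBus: 0)
let fileURL = FileManager.default.temporaryDirectory
    .appendingPathComponent("capture.caf")
let file = try AVAudioFile(forWriting: fileURL, settings: format.settings)

output.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
    try? file.write(from: buffer)
}

try engine.start()
// ... schedule and play audio; call output.removeTap(onBus: 0) when done.
```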
I believe you'll have to write a kernel extension to do that:
https://developer.apple.com/library/mac/documentation/Darwin/Conceptual/KEXTConcept/KEXTConceptIOKit/iokit_tutorial.html
You'd have to make your own audio driver to record it. It appears as though that is how Soundflowerbed was made:
http://features.en.softonic.com/how-to-record-internal-sound-on-a-mac

Is it possible to add an additional audio track to a video file in iOS?

I'm creating an app where I want the possibility to record a video (and sound), and then play it back while recording audio. After this, I'd be left with a video file (containing audio) and a separate audio file (completely different from the video's audio track).
Is it possible to use AVMutableCompositionTrack to compose a new video file containing one video track and two separate audio tracks, and then using AVAssetExportSession to export this to one single standalone video-file which keeps these audio tracks separated? What I hope to achieve with this is that the user can later watch this video-file and choose if one or both audio tracks should be playing. I know of the possibility to use multiple AVAssets to synchronize playback from different audio/video-files, but I'm wondering if I can create one file containing separable audio tracks, and then later define each audio track as an AVAsset to control the syncing.
I know some video formats/codecs have the ability to change the audio language, even when it's only one single file. I also know AVFoundation has great support for handling tracks. What I don't know is whether these are compatible with each other: can AVFoundation handle separate tracks from within one single file? And are there any codecs with support for such tracks on iOS (e.g. .mp4, .mov, ...)?
I imagine "problems" would occur if a standard video player tried to play this resulting movie file (possibly only playing the video with the first audio track), but since I can already assume there are two (or more) audio tracks, I'm thinking it could be done?
Is this possible in any way?
Yes, it is possible to create a video file with multiple audio tracks by using AVAssetWriterInputGroup. The reference says:
Use this class to associate tracks corresponding to multiple AVAssetWriterInput instances as mutually exclusive to each other for playback or other processing.
For example, if you are creating an asset with multiple audio tracks using different spoken languages—and only one track should be played at a time—group the inputs corresponding to those tracks into a single instance of AVAssetWriterInputGroup and add the group to the AVAssetWriter instance using the AVAssetWriter method add(_:).
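A minimal sketch of that grouping step, assuming you are already appending samples with an AVAssetWriter; the output URL is illustrative, and the nil output settings mean pass-through (the samples are already encoded):

```swift
import AVFoundation

let writer = try AVAssetWriter(
    outputURL: FileManager.default.temporaryDirectory
        .appendingPathComponent("movie.mov"),
    fileType: .mov)

let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: nil)
let originalAudio = AVAssetWriterInput(mediaType: .audio, outputSettings: nil)
let recordedAudio = AVAssetWriterInput(mediaType: .audio, outputSettings: nil)

writer.add(videoInput)
writer.add(originalAudio)
writer.add(recordedAudio)

// Mark the two audio tracks as mutually exclusive alternates; a player
// then plays only one at a time, and the app can switch between them
// later through AVPlayerItem's media selection APIs.
let audioGroup = AVAssetWriterInputGroup(
    inputs: [originalAudio, recordedAudio],
    defaultInput: originalAudio)
if writer.canAdd(audioGroup) {
    writer.add(audioGroup)
}
// ... then startWriting(), startSession(atSourceTime:), append samples,
// and finishWriting(completionHandler:) as usual.
```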

Playing Multiple Portions of a YouTube Video

I know we can start playing a YouTube video at any minute, since YouTube provides an API for that.
What I want is to play multiple portions of the video. For example, in a video I want to play from minute 1 to 2, then from minute 5 to 10, and again from minute 43 to 54.
Is there an API or tool for doing that?
One thing I thought of was creating different videos and putting them in a playlist, but I'm not sure how fast that would be.
Let me know if there is a way to do this.
Thanks in advance.
There is the YouTube Player API.
#Getting started shows an example of a video being played for 6 seconds from the beginning before it stops.
#seekTo allows advancement of video playback.
You may also use #loadVideoById to pass in the start time and end time via the object syntax, although this method may require the loading of the same video multiple times.

Single AVPlayer with both streaming and non-streaming content

I'm building a video player that should handle both streaming and non-streaming content and I want it to be playable with AirPlay.
I'm currently using multiple AVPlayer instances (one for each clip), and it works okay, but the problem is it doesn't give a very smooth experience when using AirPlay. The interface jumps back and forth between each clip when switching AVPlayer, so I would like to migrate to using a single AVPlayer. This seems like a trivial task, but I haven't yet found a way to do this.
This is what I've tried so far:
Using a single AVPlayer with multiple AVPlayerItems and switching between those using replaceCurrentItemWithPlayerItem (see the sketch after this list). This works fine when switching between streaming->streaming or non-streaming->non-streaming clips, but AVPlayer doesn't seem to accept replacements between streaming->non-streaming or vice versa. Basically, nothing happens when I try to switch.
Using an AVQueuePlayer with multiple AVPlayerItems fails for the same reason as above.
Using a single AVPlayer with a single AVPlayerItem based on an AVMutableComposition asset. This doesn't work because streaming content is not allowed in an AVMutableComposition (and AVURLAssets created from a streaming URL don't have any AVAssetTracks, which are required).
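A minimal sketch of the first approach, with illustrative URLs (an HLS playlist for the streaming item, a bundled file for the other):

```swift
import AVFoundation

let streamingItem = AVPlayerItem(
    url: URL(string: "https://example.com/stream.m3u8")!)
let fileItem = AVPlayerItem(
    url: Bundle.main.url(forResource: "clip", withExtension: "mp4")!)

let player = AVPlayer(playerItem: streamingItem)
player.play()

// Fine for streaming -> streaming or file -> file swaps, but as described
// above, nothing happens when swapping between a streaming item and a
// file-based item.
player.replaceCurrentItem(with: fileItem)
```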
So is there anything I am missing? Any other suggestion on how to accomplish this?
I asked Apple's Technical Support about this and got the answer that it's currently not possible to avoid the short jump back to the menu interface, and that no version of AVPlayer supports mixing streaming and non-streaming content.
Full response:
This is in response to your question about how to avoid the short jump back to the main interface when switching AVPlayers or AVPlayerItems for different media items while playing over AirPlay.
The issue here is the same with AVPlayer and AVQueuePlayer: no instance of AVPlayer (regardless of which particular class) can currently play both streaming and non-streaming content; that is, you can't mix HTTP Live Streaming media (e.g. .m3u8) with non-streaming media (a file-based resource such as an .mp4 file).
And with regard to AVMutableComposition, it does not allow streaming content.
Currently, there is no support for "seamless" video playback across multiple items. I encourage you to file an enhancement request for this feature using the Apple Bug Reporter (http://developer.apple.com/bugreporter/).
AVComposition is probably the best option at present for "seamless" playback. However, it has the limitation just described where streaming content is not allowed.
