Looping AVPlayer seamlessly - iOS

There has been some discussion before about how to loop an AVPlayer's video item, but no 'solution' is seamless enough to provide lag-less looping of a video.
I am developing a tvOS app that has a high-quality 10 second clip of 'scenery' in the background of one of its views, and simply restarting its AVPlayer the 'standard' way (subscribing to an NSNotification to catch the end of playback) is too jumpy to go unnoticed, and it detracts from the user experience.
It seems as though the only way to achieve a truly seamless loop is to manage frames manually, at a lower level (in OpenGL)...
Despite my best efforts to read up on this, and as a novice at manipulating video pipelines, I have not come close to a comprehensible solution.
I am aware that external libraries exist to perform this behaviour more easily; most notably GPUImage. However, the app I am developing is for tvOS and therefore has difficulty using many of the existing 3rd-party iOS libraries, GPUImage included. Another library I have come across is AVAnimator, which provides great functionality for lightweight animation videos, but not for dense, high-quality clips of source footage encoded in H.264.
The closest I have come so far is Apple's own AVCustomEdit source code, however this primarily deals with static production of a 'transition' that, while seamless, is too complex for me to be able to discern how to make it perform simple looping functionality.
If anybody can chip in with experience of manipulating AVPlayer at a lower level, i.e. with image processing/buffers (or iOS development that doesn't rest on external libraries), I would be incredibly interested to know how I could make a start.

I had the same problem when streaming a video. After playing for the first time, there was a black screen when loading the video for the second time. I got rid of the black screen by seeking the video about 50 ms ahead of the start (the CMTimeMake(5, 100) below). It made a nearly seamless video loop. (Swift 2.1)
// Create the player.
let player = AVPlayer(URL: videoURL)

// Restart playback when the item finishes, seeking slightly past zero
// to avoid the black frame on loop.
NSNotificationCenter.defaultCenter().addObserverForName(AVPlayerItemDidPlayToEndTimeNotification, object: player.currentItem, queue: nil) { notification in
    let t1 = CMTimeMake(5, 100) // 50 ms
    player.seekToTime(t1)
    player.play()
}
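For reference, the same idea in current Swift, using a zero-tolerance seek so the restart lands on an exact frame (a sketch building on the snippet above; player and videoURL are assumed to exist as before):

import AVFoundation

NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime,
                                       object: player.currentItem,
                                       queue: .main) { _ in
    // Seek 50 ms past zero with no tolerance, then resume playback.
    player.seek(to: CMTime(value: 5, timescale: 100),
                toleranceBefore: .zero, toleranceAfter: .zero) { _ in
        player.play()
    }
}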

If the video is very short (a few seconds), you can probably extract each frame as a CGImage and use CAKeyframeAnimation to animate them. I am using this technique to play GIF images in my app and the animation is very smooth.
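A minimal sketch of that idea in current Swift, using AVAssetImageGenerator to pull the frames (the 30 fps sampling rate and the target layer are assumptions; note this holds every frame in memory, so it only suits short, small clips):

import AVFoundation
import UIKit

func loopFrames(of videoURL: URL, on layer: CALayer) {
    let asset = AVAsset(url: videoURL)
    let generator = AVAssetImageGenerator(asset: asset)
    // Request exact frame times, with no tolerance.
    generator.requestedTimeToleranceBefore = .zero
    generator.requestedTimeToleranceAfter = .zero

    let fps: Int32 = 30
    let seconds = CMTimeGetSeconds(asset.duration)
    var frames: [CGImage] = []
    for i in 0..<Int(seconds * Double(fps)) {
        let time = CMTime(value: CMTimeValue(i), timescale: fps)
        if let image = try? generator.copyCGImage(at: time, actualTime: nil) {
            frames.append(image)
        }
    }

    // Step through the extracted frames forever on the layer's contents.
    let animation = CAKeyframeAnimation(keyPath: "contents")
    animation.values = frames
    animation.duration = seconds
    animation.calculationMode = .discrete
    animation.repeatCount = .infinity
    layer.add(animation, forKey: "videoLoop")
}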

You mention that you looked at AVAnimator, but did you see my blog post on this specific subject of seamless looping? I specifically built seamless looping logic in because it could not be done properly with AVPlayer and the H.264 hardware.

I use two AVPlayerItems with the same AVAsset in an AVQueuePlayer and switch the items:
weak var w = self
NSNotificationCenter.defaultCenter().addObserverForName(AVPlayerItemDidPlayToEndTimeNotification, object: nil, queue: nil) { notification in
    let queuePlayer = w!.playerController.player! as! AVQueuePlayer
    // Re-queue the item that just finished and rewind it for its next pass.
    if queuePlayer.currentItem == playerItem1 {
        queuePlayer.insertItem(playerItem2, afterItem: nil)
        playerItem1.seekToTime(kCMTimeZero)
    } else {
        queuePlayer.insertItem(playerItem1, afterItem: nil)
        playerItem2.seekToTime(kCMTimeZero)
    }
}
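For context, a minimal setup for the two items and the queue player used above might look like this (current Swift syntax; videoURL is assumed):

import AVFoundation

let asset = AVAsset(url: videoURL)
// Two items backed by the same asset, so alternating between them is cheap.
let playerItem1 = AVPlayerItem(asset: asset)
let playerItem2 = AVPlayerItem(asset: asset)
let queuePlayer = AVQueuePlayer(items: [playerItem1, playerItem2])
queuePlayer.play()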

Related

Swift change Pitch and Speed of recorded audio

I have an app whose foundation is essentially based on https://blckbirds.com/post/voice-recorder-app-in-swiftui-1/.
It's Swift / Xcode 12.5.1 and works great. I call the audio using self.audioPlayer.startPlayback(audio: self.audioURL) which plays the recording perfectly.
Now I want to add the ability for the user to adjust the pitch and speed of the recorded audio. It doesn't have to save the changes, just apply the changes while playing the file on the fly.
I found https://www.hackingwithswift.com/example-code/media/how-to-control-the-pitch-and-speed-of-audio-using-avaudioengine which simplifies the process of applying pitch changes. I'm able to change the startPlayback above to
self.audioPlayer.speedControl.rate = 0.5
do {
    try self.audioPlayer.play(self.audioURL)
} catch let error as NSError {
    print(error.localizedDescription)
}
after adding HWS's code to the AudioPlayer class, which proves it works, but it's not a full implementation: it breaks some of the other capabilities (like updating and using the stopPlayback function), which I think is due to switching between the AVAudioPlayer and the AVAudioPlayerNode. I'm trying to figure out whether I need to rewrite AudioPlayer.swift from the blckbirds tutorial, or whether there's a friendlier way to incorporate HWS's code into the project.
For example, I suppose I could create a toggle that would use the AVAudioPlayer playback if no effects are being used, then, if the toggle enables one of the effects, have it use AVAudioPlayerNode instead. But that seems inefficient. I'd appreciate any thoughts here!
Turns out this was simpler than I had thought: using @AppStorage and conditionals to integrate the desired player. Thanks!
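For anyone attempting the same integration, here is a condensed sketch of the HWS-style engine player referenced above (current Swift; the class shape and the speedControl/pitchControl names mirror the tutorial and are assumptions, not the blckbirds AudioPlayer):

import AVFoundation

final class EffectsPlayer {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    let speedControl = AVAudioUnitVarispeed()  // playback rate
    let pitchControl = AVAudioUnitTimePitch()  // pitch, in cents

    init() {
        // Signal chain: player -> speed -> pitch -> main mixer.
        engine.attach(player)
        engine.attach(speedControl)
        engine.attach(pitchControl)
        engine.connect(player, to: speedControl, format: nil)
        engine.connect(speedControl, to: pitchControl, format: nil)
        engine.connect(pitchControl, to: engine.mainMixerNode, format: nil)
    }

    func play(_ url: URL) throws {
        let file = try AVAudioFile(forReading: url)
        if !engine.isRunning { try engine.start() }
        player.scheduleFile(file, at: nil, completionHandler: nil)
        player.play()
    }

    func stopPlayback() {
        player.stop()
    }
}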

Best way to play silence using AVAudioPlayer on iOS

I found myself in a situation where I need to simulate audio playback to trick OS controls and MPNowPlayingInfoCenter into thinking that audio is being played. This is because I am building a player that plays multiple audio tracks with pauses in between, creating one continuous "audio" track. I already have everything set up inside the app itself, and the lock screen controls are working correctly. The only problem I am facing is that while the actual audio stops and a pause is being "played", the lock screen info center stops the timer, and it only resumes showing the correct time and overall state once another audio track starts playing.
Here is the example of my audio track built from audio files and pause items:
let items: [AudioItem] = [
    .audio("part-1.mp3"),
    .pause(duration: 5), // value of type TimeInterval
    .audio("part-2.mp3"),
    .pause(duration: 3),
    ... // the list goes on
]
Then, in my custom player, once AVAudioPlayer finishes its job with the current item, I get the next one from the array and play either a .pause with a scheduled Timer or another .audio with AVAudioPlayer.
extension Player: AVAudioPlayerDelegate {
    func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
        playNextItem()
    }
}
And here lies the problem: once the AVAudioPlayer stops, the Now Playing info center automatically stops too, even though I keep feeding it fresh nowPlayingInfo. Then, when it hits another .audio item, it resumes correctly and shows the current time, etc.
And here lies the question: how do I trick the MPNowPlayingInfoCenter into thinking that audio is being played while I "play" my .pause item?
I realise it may still not be clear what I am trying to achieve, but I am happy to share more insight if needed. Thanks!
Some solutions I am currently thinking about:
A. Keeping a 1 s long empty audio track that would play on loop for as long as the pause needs to last.
B. Programmatically creating an empty audio track of the appropriate length and playing it instead of using a Timer to track pause duration/progress, relying completely on AVAudioPlayer for both .audio and .pause items. Not sure this is possible, though.
C. Maybe there is a way to tell the MPNowPlayingInfoCenter that the audio keeps playing, without using AVAudioPlayer, through some API I am not familiar with?
AVAudioPlayer is probably the wrong tool here. You want AVAudioPlayerNode, which is slightly lower-level. Create an AVAudioEngine, and attach an AVAudioPlayerNode. You can then call scheduleFile(_:at:completionHandler:) to play the audio at the times you want.
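A minimal sketch of that approach (the file URLs and the gap length come from the question; the class shape is an assumption):

import AVFoundation

final class GapPlayer {
    private let engine = AVAudioEngine()
    private let node = AVAudioPlayerNode()

    // Plays two files separated by silence; the engine keeps running through
    // the gap, so the system still sees active playback.
    func playWithGap(first: URL, second: URL, gapSeconds: Double) throws {
        let file1 = try AVAudioFile(forReading: first)
        let file2 = try AVAudioFile(forReading: second)

        engine.attach(node)
        engine.connect(node, to: engine.mainMixerNode, format: file1.processingFormat)
        try engine.start()

        // File 1 starts immediately on the node's sample timeline.
        node.scheduleFile(file1, at: nil, completionHandler: nil)

        // File 2 is scheduled at file 1's length plus the gap, in samples.
        let rate = file1.processingFormat.sampleRate
        let startFrame = file1.length + AVAudioFramePosition(gapSeconds * rate)
        node.scheduleFile(file2,
                          at: AVAudioTime(sampleTime: startFrame, atRate: rate),
                          completionHandler: nil)

        node.play()
    }
}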
Much of the Apple documentation on AVAudioEngine appears broken right this moment, but hopefully the links under Audio Engine Building Blocks will be available again shortly. (If it stays down and you have trouble finding docs, leave a comment and I'll hunt down the WWDC videos and other tutorials on using AVAudioEngine. It's not particularly difficult for simple problems.)
If you know in advance how you want to compose these items (and it looks like you may), see also AVMutableComposition, which lets you glue together assets very efficiently, including adding empty segments of silence. See Media Composition and Editing for the various tools in that space.
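A sketch of the composition route, under the assumption that the resulting single item is played with AVPlayer rather than AVAudioPlayer (since the silences are then part of one continuous timeline, playback does not stop between parts):

import AVFoundation

func makeComposition(parts: [URL], pauses: [TimeInterval]) throws -> AVMutableComposition {
    let composition = AVMutableComposition()
    var cursor = CMTime.zero
    for (index, url) in parts.enumerated() {
        let asset = AVURLAsset(url: url)
        // Append the whole asset at the current cursor position.
        try composition.insertTimeRange(CMTimeRange(start: .zero, duration: asset.duration),
                                        of: asset, at: cursor)
        cursor = CMTimeAdd(cursor, asset.duration)
        // Follow it with an empty (silent) range, if a pause is defined.
        if index < pauses.count {
            let gap = CMTime(seconds: pauses[index], preferredTimescale: 600)
            composition.insertEmptyTimeRange(CMTimeRange(start: cursor, duration: gap))
            cursor = CMTimeAdd(cursor, gap)
        }
    }
    return composition
}

// Usage sketch: let player = AVPlayer(playerItem: AVPlayerItem(asset: composition))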

iOS 11.1 UIImagePickerController video crop start time not movable

We are displaying a UIImagePickerController for users to choose (and crop) a video for use within our app. Recently users have been experiencing issues trying to crop videos, with the start time handle becoming almost impossible to drag.
It seems that the Photos app doesn't have this issue because the video timeline (and crop selection) is moved to the bottom of the screen.
I assume this has to do with the new Notification Center gestures that were added for the iPhone X. I believe this question here is related to the issue we're experiencing.
Anyone else having this issue, or have a way to get around it? Since this is a stock UIViewController I can't see how we can get around the issue without building our own custom video picker/cropper.
I was having the same issue, so I decided to disable the UIImagePickerController editing, catch the video path in UIImagePickerController's didFinishPickingMediaWithInfo, and then use UIVideoEditorController to edit the video. This is a quick example:
if UIVideoEditorController.canEditVideo(atPath: videoPath) {
    let editController = UIVideoEditorController()
    editController.videoPath = videoPath
    editController.delegate = self
    present(editController, animated: true)
}
For more information about UIVideoEditorController, check Apple's documentation: https://developer.apple.com/documentation/uikit/uivideoeditorcontroller
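A sketch of the surrounding delegate plumbing (the callbacks are the real UIVideoEditorControllerDelegate methods; MyViewController is a stand-in for your presenting controller):

extension MyViewController: UIVideoEditorControllerDelegate, UINavigationControllerDelegate {
    func videoEditorController(_ editor: UIVideoEditorController,
                               didSaveEditedVideoToPath editedVideoPath: String) {
        editor.dismiss(animated: true)
        // The trimmed clip lives at editedVideoPath; copy or process it here.
    }

    func videoEditorControllerDidCancel(_ editor: UIVideoEditorController) {
        editor.dismiss(animated: true)
    }

    func videoEditorController(_ editor: UIVideoEditorController,
                               didFailWithError error: Error) {
        editor.dismiss(animated: true)
        print(error.localizedDescription)
    }
}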

SceneKit - Audio causes cxa_throw with a lag the first time I play a sound

I play back a sound like this (this is inside an SCNNode subclass):
let audioSource = SCNAudioSource(named: "coin.wav")! // the initializer is failable
let audioPlayer = SCNAudioPlayer(source: audioSource)
self.addAudioPlayer(audioPlayer)
The first time this is called, I get a severe lag and an exception is thrown.
I notice the lag when I disable the All Exceptions breakpoint.
What can I do about this?
The C++ exception comes from AVAudioEngine that is used by the SceneKit audio layer. The AVAudio* framework uses C++ exceptions internally so if you have a breakpoint set in Xcode to break when C++ exceptions are thrown Xcode will break a lot in the AVAudio* code (mostly at init times). You can safely ignore these as they are caught by the framework before they reach your code anyway.
If you don't want the lag you can instantiate your audio source and load it at startup time:
let audioSource = SCNAudioSource(named: "coin.wav")!
audioSource.load()
And then add the player when you need it later:
let audioPlayer = SCNAudioPlayer(source: audioSource)
self.addAudioPlayer(audioPlayer)
By the way, players are cached and recycled so you don't have to worry too much about memory being used for nothing.
Note also that SCNAction uses exactly the same API as you do, so if you create an action with a sound that hasn't previously been loaded into memory with .load(), you will also get a lag.
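Putting both points together, a small sketch (the node and the preload timing are assumptions):

// Preload once, e.g. during scene setup.
let coinSource = SCNAudioSource(named: "coin.wav")!
coinSource.load()

// Later, play it through an action; no hitch, since the source is already in memory.
node.runAction(SCNAction.playAudio(coinSource, waitForCompletion: false))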
Hope this helps,
S.
The SCNAudioPlayer approach is really only for atmospheric, ambient and background environment sorts of sounds, to be looped and moved with a character or attached to the position of a waterfall or something like that.
It's not a performant, instant sound player for immediate sound effects requiring low latency.
For that you're better off using an SCNAction to play the audio as an Action when needed, or using something like Fmod that's designed for low latency sound playback.
I'm not sure how I know this.

NetStream HTTP video not playing on iOS device

I am trying to play a video on iPad; my code is below:
public function init_RTMP():void
{
    videoURL = "http://rest************_iphone_high.mp4";
    vid = new Video();
    nc = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onConnectionStatus);
    nc.connect(null);
}

private function onConnectionStatus(e:NetStatusEvent):void
{
    if (e.info.code == "NetConnection.Connect.Success")
    {
        trace("Creating NetStream");
        netStreamObj = new NetStream(nc);
        metaListener = new Object();
        metaListener.onMetaData = received_Meta;
        netStreamObj.client = metaListener;
        netStreamObj.play(videoURL);
        vid.attachNetStream(netStreamObj);
        addChild(vid);
    }
}
When I play it on my system it works fine, but when I create an iOS app from it and install it on the device, it shows a blank white screen.
If anyone has had the same problem or has any ideas, please share.
As VC.One pointed out, AIR for iOS does not play most (but not all; it will occasionally play a very specific encode type) H.264 encoded videos. There are three solutions:
As VC.One said, you encode as FLV. Doing this is not good and I would not recommend it. FLV is not hardware accelerated (unless things have changed recently and I have not seen the updates) and will run entirely on the CPU, meaning your app will run slowly and eat battery much quicker than normal.
Use StageWebView, in which case you just plug in the URL to the video and it will play using the native video player. This has the downside that you cannot skin the player and you cannot control it. Once it begins playing, you have no control over it except for unloading the page. It works very well, however, and is fairly easy to implement, though the video will appear on top of the stage (it is not in the Display List).
The last option is to use StageVideo. This will play videos using the native framework, so you can easily play H.264 and it will be hardware accelerated. Additionally, this is just a NetStream player, so you have full control over it. And best yet, it has no chrome, so you can build a player around the video screen. However, like StageWebView, StageVideo is not in the Display List. But unlike StageWebView, it is rendered directly on the stage, below everything else, so the app itself will cover the video. You can get around this by creating a class to mask your app around the video, but it is incredibly difficult to pull off properly. It took me about 12 hours to create my StageVideo player and the masking class, plus another half day later on fixing issues with the masking class and how it handles DPI changes (hint: do NOT set applicationDPI if you are using Flex).
As always, make sure your AIR SDK is up to date as well. 3.5-3.7 have all added a ton of new features and bug fixes for iOS applications so updating to AIR 3.7 might actually solve or make your issue less of a problem (I don't think it will, but it is always worth a shot, right?)
See this link:
Netstream video not playing on iPad
Basically it was fixed by encoding the video file as FLV not MP4.
