Playing and caching a remote asset with AVPlayer - iOS

I am trying to use AVPlayer to play and cache a remote asset using two tools from GitHub: CachingPlayerItem together with Cache. I found a solution elsewhere (scroll down) that nearly gets me there. My issue now is that I have to tap twice on the remote audio asset (a hyperlink stored in Firebase) to get it to stream. For some mysterious reason, AVPlayer will not play the remote asset in my case unless it is already cached. I am aware that I could stream the URL directly using AVPlayerItem(url:), but that is not the solution I am seeking; the sample code for CachingPlayerItem says that should not be necessary.
In my tinkering, I suspect something is going wrong with the asynchronous operations that are performed when I set playerItem.delegate = self. Maybe I am misunderstanding how this asynchronous delegate operation works... Any clarity and pointers would be appreciated.
import AVKit
import Cache

class AudioPlayer: AVPlayer, ObservableObject, AVAudioPlayerDelegate {
    let diskConfig = DiskConfig(name: "DiskCache")
    let memoryConfig = MemoryConfig(expiry: .never, countLimit: 10, totalCostLimit: 10)

    lazy var storage: Cache.Storage<String, Data>? = {
        return try? Cache.Storage(diskConfig: diskConfig, memoryConfig: memoryConfig, transformer: TransformerFactory.forData())
    }()

    /// Plays audio either from the network if it's not cached or from the cache.
    func startPlayback(with url: URL) {
        let playerItem: CachingPlayerItem
        do {
            let result = try storage!.entry(forKey: url.absoluteString)
            // The asset is already cached: play it from the cached data.
            playerItem = CachingPlayerItem(data: result.object, mimeType: "audio/mp4", fileExtension: "m4a")
        } catch {
            // The asset is not cached: stream it from the network.
            playerItem = CachingPlayerItem(url: url)
        }
        playerItem.delegate = self // Seems to be the problematic line if the result is not cached.
        self.replaceCurrentItem(with: playerItem) // This line is different from what you do. The behaviour doesn't change whether I have AVPlayer as a private var.
        self.automaticallyWaitsToMinimizeStalling = false
        self.play()
    }
}

extension AudioPlayer: CachingPlayerItemDelegate {
    func playerItem(_ playerItem: CachingPlayerItem, didFinishDownloadingData data: Data) {
        // The asset is downloaded. Save it to the cache asynchronously.
        storage?.async.setObject(data, forKey: playerItem.url.absoluteString, completion: { _ in })
        print("Caching done!")
    }
}
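
One idea I have been toying with (only a sketch, not verified against CachingPlayerItem's internals) is to defer play() until the item actually reports .readyToPlay, instead of calling it right after replaceCurrentItem(with:). The helper class and method names below are my own assumptions; CachingPlayerItem is simply treated as an ordinary AVPlayerItem subclass, and the status check uses standard block-based KVO:

import AVFoundation

// Sketch only: defer play() until the player item reports it is ready,
// using standard block-based KVO on AVPlayerItem.status.
// The class and method names here are assumptions, not CachingPlayerItem API.
final class DeferredPlaybackHelper {
    private var statusObservation: NSKeyValueObservation?

    func startWhenReady(_ item: AVPlayerItem, on player: AVPlayer) {
        statusObservation = item.observe(\.status, options: [.initial, .new]) { item, _ in
            if item.status == .readyToPlay {
                player.play()
            }
        }
        player.replaceCurrentItem(with: item)
    }
}

The helper keeps the observation alive for its own lifetime; in a real implementation you would probably invalidate it once playback has started.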

Related

How can I record a video and synchronize it with an AVPlayer?

I need to record a video and show a video with an AVPlayer at the same time. The result needs to be synchronized: if the video I'm showing is a metronome, or someone clapping, the recorded video and the player should click or clap at the same time.
I play the AVPlayer using preroll and setRate, with a delegate in the camera controller.
Using AVCaptureMovieFileOutput, I've tried calling setRate on the player once fileOutput(output:didStartRecordingTo) is called, but the videos end up desynchronized, with the recorded video lagging behind the player.
Using an AVAssetWriter, I've tried calling setRate on the player once captureOutput(captureOutput:sampleBuffer) is called and the first buffer is appended, but the result is the same.
Is there any other way to do this?
EDIT: Adding some code to show what I'm doing:
Camera with AVCaptureMovieFileOutput:
func startRecordingVideo() {
    guard let movieFileOutput = self.movieFileOutput else {
        return
    }
    sessionQueue.async {
        if !movieFileOutput.isRecording {
            let movieFileOutputConnection = movieFileOutput.connection(with: .video)
            movieFileOutput.setOutputSettings([AVVideoCodecKey: AVVideoCodecH264], for: movieFileOutputConnection!)
            let outputFileName = NSUUID().uuidString
            let outputFilePath = (NSTemporaryDirectory() as NSString).appendingPathComponent((outputFileName as NSString).appendingPathExtension("mov")!)
            movieFileOutput.startRecording(to: URL(fileURLWithPath: outputFilePath), recordingDelegate: self)
        } else {
            movieFileOutput.stopRecording()
        }
    }
}

func fileOutput(_ output: AVCaptureFileOutput, didStartRecordingTo fileURL: URL, from connections: [AVCaptureConnection]) {
    cameraDelegate?.startedRecording()
}
Implementing the delegate's startedRecording in the view controller:
private func setRateForAll() {
    let hostTime = CMClockGetTime(CMClockGetHostTimeClock())
    activePlayers.forEach {
        $0.setRate(atHostTime: hostTime)
    }
}
Once the active player or players end, I call stopRecording on the camera, and show the resulting recorded video in another player, along with the player or players I've been showing.
I ended up using the recording session's master clock as the hostTime for the setRate method. The desynchronization is almost imperceptible now!
In fileOutput(output:didStartRecordingTo):
cameraDelegate?.startedRecording(clock: CMClockGetTime(self.session.masterClock!))
In the view with the players:
private func setRateForAll(clock: CMTime) {
    activePlayers.forEach {
        $0.setRate(atHostTime: clock)
    }
}
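
For reference, putting the pieces above together (preroll every player, then start them all against the capture session's clock) might look roughly like the sketch below. This is only an illustrative outline based on the standard AVPlayer.preroll(atRate:completionHandler:) and setRate(_:time:atHostTime:) APIs; activePlayers and session are taken from the snippets above, and everything else is assumed.

import AVFoundation

// Sketch: preroll all players, then start them together against the capture session's clock.
// `activePlayers` and `session` are assumed to exist as in the snippets above.
func prerollAndStart(players activePlayers: [AVPlayer], session: AVCaptureSession) {
    let group = DispatchGroup()
    activePlayers.forEach { player in
        // setRate(_:time:atHostTime:) only takes effect immediately when
        // automaticallyWaitsToMinimizeStalling is false.
        player.automaticallyWaitsToMinimizeStalling = false
        group.enter()
        player.preroll(atRate: 1.0) { _ in group.leave() }
    }
    group.notify(queue: .main) {
        // Use the session's clock so playback and the recording share one timebase.
        let clock = session.masterClock ?? CMClockGetHostTimeClock()
        let hostTime = CMClockGetTime(clock)
        activePlayers.forEach {
            $0.setRate(1.0, time: .invalid, atHostTime: hostTime)
        }
    }
}

This matches the observation above that using the session's master clock, rather than the host clock, is what brings the two timelines close enough together.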

Swift 4 - Trying to display AWS video url from backend in AVPlayerViewController not working, get weird error

I have a table view which sets a UIImage to hold either an image URL from AWS or a thumbnail generated from a video URL, also from AWS. The video URL refuses to display in my table view, and it throws this error in the debugger.
2017-12-29 12:20:37.053337-0800 VideoFit[3541:1366125] CredStore - performQuery - Error copying matching creds. Error=-25300, query={
class = inet;
"m_Limit" = "m_LimitAll";
"r_Attributes" = 1;
sync = syna;
}
When I click the cell to display either the image or the video URL, it segues to the video player correctly, and then I get the error again and the triangular play button has a line through it, signifying that there is no video to be played.
But when I print the URL, it has been passed successfully, so that is not the issue; the issue is that AVPlayer can't handle my AWS video URL for some reason. It is an https link, so it should be considered secure, but I have already set arbitrary loads to true anyway:
<key>NSAppTransportSecurity</key>
<dict>
    <key>NSAllowsArbitraryLoads</key>
    <true/>
</dict>
Here is some code for displaying my videos in the video player VC and also the thumbnail generator function; perhaps there is some issue lying with these?
import UIKit
import AVKit
import MediaPlayer

class VideoPlayerVC: AVPlayerViewController {
    var urlToPlay: String?

    override func viewDidLoad() {
        super.viewDidLoad()
        print("Here is the url ---> \(String(describing: urlToPlay))")
        playVideo()
    }

    private func playVideo() {
        guard let urlFromString = urlToPlay else { print("No url to play"); return }
        let url = URL(string: urlFromString)
        print("Here is the url to play ---> \(String(describing: url))")
        let asset: AVURLAsset = AVURLAsset(url: url!)
        let item: AVPlayerItem = AVPlayerItem(asset: asset)
        let player: AVPlayer = AVPlayer(playerItem: item)
        self.player = player
        self.showsPlaybackControls = true
        self.player?.play()
    }
}
This is how I make the thumbnail; when my cellForRow(at:) method runs, it throws this error for every video object in the table view.
override func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
    if let cell = tableView.dequeueReusableCell(withIdentifier: "sortedExerciseCell") as? SortedExerciseCell {
        // Do an if check for videoURI and imageURI
        if selectedExerciseArray[indexPath.row].imageURI != nil {
            if let imageURI = URL(string: selectedExerciseArray[indexPath.row].imageURI!) {
                print("It's a photo!")
                // Using DispatchQueue.global(qos: .background).async loads cells in the background
                DispatchQueue.global(qos: .background).async {
                    let data = try? Data(contentsOf: imageURI)
                    DispatchQueue.main.async {
                        cell.exerciseImg.image = UIImage(data: data!)
                    }
                }
            }
        } else {
            if let videoURI = URL(string: selectedExerciseArray[indexPath.row].videoURI!) {
                print("It's a video!")
                print(videoURI)
                DispatchQueue.global(qos: .background).async {
                    DispatchQueue.main.async {
                        cell.exerciseImg.image = self.thumbnailForVideoAtURL(url: videoURI)
                        // for every video object the above error is thrown!!!
                    }
                }
            }
        }
        cell.exerciseName.text = selectedExerciseArray[indexPath.row].name
        return cell
    } else {
        return UITableViewCell()
    }
}
// Used to display the video thumbnail in the cell; when tapped, the video is displayed as needed
private func thumbnailForVideoAtURL(url: URL) -> UIImage? {
    let asset = AVAsset(url: url)
    let assetImageGenerator = AVAssetImageGenerator(asset: asset)
    do {
        print("Doing the video thumbnail from backend")
        let imageRef = try assetImageGenerator.copyCGImage(at: CMTimeMake(1, 60), actualTime: nil)
        return UIImage(cgImage: imageRef)
    } catch {
        print("This is failing for some reason")
        print(error)
        return nil
    }
}
I've looked at similar questions on Stack Overflow about this, but none seem to give a complete answer on how to solve the problem; most chalk it up to an iOS 11 bug that can't be fixed, or to transport security (I already have arbitrary loads on, so that can't be the issue). Does anyone have any approaches that might work?
Important note: my backend developer can only view the video URLs from a webpage; in other words, he had to make a basic website to display the video after downloading it from the web. I'm not sure if this is standard procedure for handling video URLs from AWS, but I decided to try loading the URL with loadHTMLString in a custom UIViewController conforming to WKUIDelegate. I see the video, but the same thing happens: the triangular play button is crossed out, signifying that no video can be played. I'm not really sure what else I can try at this moment; any help is appreciated.
Here is one of the links I've pulled from my backend.
https://videofitapptestbucket.s3.us-west-2.amazonaws.com/100001427750
It seems that your problem is in the file extension. DMS is not recognized by the OS, and it throws when creating the image with the asset image generator.
Error Domain=AVFoundationErrorDomain Code=-11828 "Cannot Open" UserInfo={NSLocalizedFailureReason=This media format is not supported., NSLocalizedDescription=Cannot Open, NSUnderlyingError=0x6040000534d0 {Error Domain=NSOSStatusErrorDomain Code=-12847 "(null)"}}
Rename your files on the server; the mp4 extension seems to work just fine with your file.
Also, subclassing AVPlayerViewController is generally not a great idea, as Apple says it will result in undefined behavior (read: a random mess). I would suggest using a class that has an AVPlayer inside instead.
Try:
if let path = urlToPlay {
    let url = URL(string: path)!
    let videoPlayer = AVPlayer(url: url)
    self.player = videoPlayer
    DispatchQueue.main.async {
        self.player?.play()
    }
}
Call it in the viewDidAppear() method.
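
If you want to drop the subclass entirely, a minimal sketch of the same idea (presenting a stock AVPlayerViewController from an ordinary view controller; the presentVideo(from:on:) helper and its parameters are my assumptions, not code from the question) could look like this:

import AVKit
import UIKit

// Sketch: present a plain AVPlayerViewController instead of subclassing it.
// `urlString` is assumed to be the same backend URL the question deals with.
func presentVideo(from urlString: String, on presenter: UIViewController) {
    guard let url = URL(string: urlString) else { return }
    let playerViewController = AVPlayerViewController()
    playerViewController.player = AVPlayer(url: url)
    presenter.present(playerViewController, animated: true) {
        playerViewController.player?.play()
    }
}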

Swift and AVFoundation: Playing audio while animating a label

I have an app that I'm adding sounds to. It has a keypad and when the user taps a button, the number animates to show the user that their press went through.
However, since both happen on the main thread, adding the code below causes the play() call to introduce a slight delay in the animation. If the user waits ~2 seconds and taps a keypad number again, they see the delay again; as long as they keep hitting keypad numbers within about 2 seconds, they don't see another lag.
I tried wrapping the code below in a DispatchQueue.main.async {} block with no luck.
if let sound = CashSound(rawValue: "buttonPressMelody\(count)"),
   let url = Bundle.main.url(forResource: sound.rawValue, withExtension: sound.fileType) {
    self.audioPlayer = try? AVAudioPlayer(contentsOf: url)
    self.audioPlayer?.prepareToPlay()
    self.audioPlayer?.play()
}
How can I play this audio and have the animation run without them interfering, with the audio coinciding with the press?
Thanks
I experienced a rather similar problem in my SwiftUI app, and in my case the solution was proper resource loading and proper AVAudioPlayer initialization and preparation.
I use the following class to load audio from disk (inspired by the ImageStore class from Apple's Landmarks SwiftUI tutorial):
import AVFoundation

final class AudioStore {
    typealias Resources = [String: AVAudioPlayer]

    fileprivate var audios: Resources = [:]
    static var shared = AudioStore()

    func audio(with name: String) -> AVAudioPlayer {
        let index = guaranteeAudio(for: name)
        return audios.values[index]
    }

    static func loadAudio(with name: String) -> AVAudioPlayer {
        guard let url = Bundle.main.url(forResource: name, withExtension: "mp3") else {
            fatalError("Couldn't find audio \(name).mp3 in main bundle.")
        }
        do { return try AVAudioPlayer(contentsOf: url) }
        catch { fatalError("Couldn't load audio \(name).mp3 from main bundle.") }
    }

    fileprivate func guaranteeAudio(for name: String) -> Resources.Index {
        if let index = audios.index(forKey: name) { return index }
        audios[name] = AudioStore.loadAudio(with: name)
        return audios.index(forKey: name)!
    }
}
In the view's init, I initialize the player instance by calling audio(with:) with the proper resource name.
In onAppear() I call prepareToPlay() on the view's already-initialized player, with proper optional unwrapping, and finally I play the audio when the gesture fires.
Also, in my case I needed to delay the actual playback by about 0.3 seconds, and for that I dispatched it to a global queue. I should stress that the animation with the audio playback was smooth even without dispatching to a background queue, so I concluded that the key was proper initialization and preparation. To delay the playback, however, you have to use a background thread, otherwise you will get the lag.
DispatchQueue.global(qos: .userInitiated).asyncAfter(deadline: .now() + 0.3) {
    // your audio.play() analog here
}
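
To make the flow concrete, here is a hypothetical SwiftUI usage sketch of the steps above; the KeypadButton view, the "buttonPress" resource name, and the tap gesture are my own placeholders, not part of the original app:

import SwiftUI
import AVFoundation

// Sketch: wiring AudioStore into a SwiftUI view as described above.
// "buttonPress" is a hypothetical resource name.
struct KeypadButton: View {
    private let player: AVAudioPlayer

    init() {
        // Initialize the player up front, in the view's init.
        player = AudioStore.shared.audio(with: "buttonPress")
    }

    var body: some View {
        Text("1")
            .onAppear { _ = player.prepareToPlay() }   // prepare once the view appears
            .onTapGesture { _ = player.play() }        // play when the gesture fires
    }
}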
Hope that will help some soul out there.

iOS Action Extension share from GarageBand

I am building an action extension for GarageBand on iOS which transforms and uploads audio, but no matter what I try, I just cannot get to the exported file.
Let's consider the following code. It should:
- find and load the shared audio from extensionContext
- initialise an audio player
- play the sound
It works if I run the extension in Voice Memos.app; the URL to the file looks like this: file:///tmp/com.apple.VoiceMemos/New%20Recording%202.m4a
Now, if I run the code in GarageBand.app, the URL points to what I presume is GarageBand's app container, as the URL looks something like /var/…/Containers/…/Project.band/audio/Project.m4a, and the audio will not be loaded and therefore cannot be manipulated in any way.
Edit: If I try to load the contents of the audio file, it looks like the data only contains an AAC header (?) and the rest of the file is empty.
What is interesting is this: the extension renders a react-native view, and if I pass the view the fileUrl (/var/…Project.band/audio/Project.m4a) and then pass it down to XMLHttpRequest, the file gets uploaded. So it looks like the file can be accessed in some way?
I'm new to Swift/iOS development, so this is kind of frustrating for me; I feel like I have tried just about everything.
The code:
override func viewDidLoad() {
    super.viewDidLoad()

    var audioFound: Bool = false
    for inputItem: Any in self.extensionContext!.inputItems {
        let extensionItem = inputItem as! NSExtensionItem
        for attachment: Any in extensionItem.attachments! {
            print("attachment = \(attachment)")
            let itemProvider = attachment as! NSItemProvider
            if itemProvider.hasItemConformingToTypeIdentifier(kUTTypeAudio as String) {
                itemProvider.loadItem(forTypeIdentifier: kUTTypeAudio as String,
                                      options: nil, completionHandler: { (audioURL, error) in
                    OperationQueue.main.addOperation {
                        if let audioURL = audioURL as? NSURL {
                            print("audioUrl = \(audioURL)")
                            // in our sample code we just present and play the audio in our app extension
                            let theAVPlayer: AVPlayer = AVPlayer(url: audioURL as URL)
                            let theAVPlayerViewController: AVPlayerViewController = AVPlayerViewController()
                            theAVPlayerViewController.player = theAVPlayer
                            self.present(theAVPlayerViewController, animated: true) {
                                theAVPlayerViewController.player!.play()
                            }
                        }
                    }
                })
                audioFound = true
                break
            }
        }
        if audioFound {
            break
        }
    }
}
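
One direction that might be worth trying (a sketch only, not a verified fix for the GarageBand case): ask NSItemProvider for a file representation instead of loading the item directly, so the system hands the extension a temporary copy it is allowed to read, and duplicate that copy before the completion handler returns. The copyAudioAttachment(from:completion:) helper and the destination path below are my assumptions, not code from the question.

import MobileCoreServices
import Foundation

// Sketch: request a readable temporary copy of the shared audio via a file representation.
// The copy passed to the handler is only valid inside the closure, so duplicate it first.
func copyAudioAttachment(from itemProvider: NSItemProvider,
                         completion: @escaping (URL?) -> Void) {
    guard itemProvider.hasItemConformingToTypeIdentifier(kUTTypeAudio as String) else {
        completion(nil)
        return
    }
    itemProvider.loadFileRepresentation(forTypeIdentifier: kUTTypeAudio as String) { tempURL, error in
        guard let tempURL = tempURL, error == nil else {
            completion(nil)
            return
        }
        // Copy to our own temporary directory so the file outlives this closure.
        let destination = URL(fileURLWithPath: NSTemporaryDirectory())
            .appendingPathComponent(tempURL.lastPathComponent)
        try? FileManager.default.removeItem(at: destination)
        do {
            try FileManager.default.copyItem(at: tempURL, to: destination)
            completion(destination)
        } catch {
            completion(nil)
        }
    }
}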

How to stream audio for only a known duration using swift

I'm using AVPlayer (I don't have to, but I want to stream the file and start playing it as soon as possible) to play an m4a file (it's an iTunes audio preview), but I only want it to play part of that file.
I'm able to set a start time but not an end time.
Using a timer is not working because the URL is an http address; I'm playing the file as it loads, without downloading it first.
I also saw Objective-C solutions that use KVO to know when the music starts playing, but I don't think this is the best approach, since I'm using Swift and also because of glitches that may occur, so the song would not stop at the right moment.
You can add a boundary time observer to your AVPlayer with addBoundaryTimeObserver(forTimes:queue:using:), as follows.
Update: Xcode 8.3.2 • Swift 3.1
import UIKit
import AVFoundation

class ViewController: UIViewController {
    var player: AVPlayer!
    var observer: Any!

    override func viewDidLoad() {
        super.viewDidLoad()
        guard let url = URL(string: "https://www.example.com/audio.mp3") else { return }
        player = AVPlayer(url: url)
        let boundary: TimeInterval = 30
        let times = [NSValue(time: CMTimeMake(Int64(boundary), 1))]
        // Note: the boundary observer's block takes no parameters.
        observer = player.addBoundaryTimeObserver(forTimes: times, queue: nil) { [weak self] in
            print("30s reached")
            if let observer = self?.observer {
                self?.player.removeTimeObserver(observer)
            }
            self?.player.pause()
        }
        player.play()
        print("started loading")
    }
}
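
If the preview should also start at an offset rather than at zero, a hedged sketch combining a seek with the boundary observer might look like the following; playSegment(of:from:to:) and the time values are placeholders, and the caller needs to keep a strong reference to the returned player:

import AVFoundation

// Sketch: play only the [startSeconds, endSeconds] portion of a remote file.
// The function name and parameters are placeholders.
func playSegment(of url: URL, from startSeconds: Double, to endSeconds: Double) -> AVPlayer {
    let player = AVPlayer(url: url)
    let end = CMTime(seconds: endSeconds, preferredTimescale: 600)
    var observer: Any?
    observer = player.addBoundaryTimeObserver(forTimes: [NSValue(time: end)], queue: .main) {
        player.pause()
        if let observer = observer {
            player.removeTimeObserver(observer)
        }
    }
    // Seek to the segment start, then begin playback.
    player.seek(to: CMTime(seconds: startSeconds, preferredTimescale: 600)) { _ in
        player.play()
    }
    return player
}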

Resources