AVFoundation over Cellular Data - iOS

When I use the code below and pull my HTTPS video link from Firebase over Wi-Fi, everything is smooth: the video plays immediately with zero issues. When I use the same code over cellular, everything moves extremely slowly; the video pauses and takes forever to load.
If it plays from a file, whether I'm on cellular or Wi-Fi shouldn't matter. What is the issue here?
DataModel:
class Video {
    var httpsStr: String?
    var videoURL: URL?

    convenience init(dict: [String: Any]) {
        self.init()
        if let httpsStr = dict["httpsStr"] as? String {
            self.httpsStr = httpsStr
            let url = URL(string: httpsStr)!
            let assetKeys = ["playable", "duration"]
            let asset = AVURLAsset(url: url)
            asset.loadValuesAsynchronously(forKeys: assetKeys, completionHandler: {
                DispatchQueue.main.async {
                    self.videoURL = asset.url
                    // save videoURL to FileManager to play video from disk
                }
            })
        }
    }
}
Firebase Pull:
ref.observeSingleEvent(of: .value) { (snapshot) in
    guard let dict = snapshot.value as? [String: Any] else { return }
    let video = Video(dict: dict)
    self.video = video
    DispatchQueue.main.asyncAfter(deadline: .now() + 2, execute: {
        self.playVideo()
    })
}
Play Video:
func playVideo() {
    // init AVPlayer ...
    guard let videoURL = self.video.videoURL else { return }
    let lastPathComponent = videoURL.lastPathComponent
    let file = FileManager...appendingPathComponent(lastPathComponent)
    if FileManager.default.fileExists(atPath: file.path) {
        let asset = AVAsset(url: file)
        play(asset)
    } else {
        let asset = AVAsset(url: videoURL)
        play(asset)
    }
}

func play(_ asset: AVAsset) {
    self.playerItem = AVPlayerItem(asset: asset)
    self.player?.automaticallyWaitsToMinimizeStalling = false // I also set this to true
    self.playerItem?.preferredForwardBufferDuration = TimeInterval(1)
    self.player?.replaceCurrentItem(with: playerItem!)
    // play video
}

I followed this answer and now everything seems to work smoothly on cellular data. I needed to include the "tracks" property in the assetKeys.
You create an asset from a URL using AVURLAsset. Creating the asset,
however, does not necessarily mean that it’s ready for use. To be
used, an asset must have loaded its tracks.
class Video {
    var httpsStr: String?
    var videoURL: URL?

    convenience init(dict: [String: Any]) {
        self.init()
        if let httpsStr = dict["httpsStr"] as? String {
            self.httpsStr = httpsStr
            let url = URL(string: httpsStr)!
            let assetKeys = ["playable", "duration", "tracks"] // <----- "tracks" added here
            let asset = AVURLAsset(url: url)
            asset.loadValuesAsynchronously(forKeys: assetKeys, completionHandler: {
                var error: NSError? = nil
                let status = asset.statusOfValue(forKey: "tracks", error: &error)
                switch status {
                case .loaded:
                    // Successfully loaded, continue processing
                    DispatchQueue.main.async {
                        self.videoURL = asset.url
                        // save videoURL to FileManager to play video from disk
                    }
                case .failed:
                    // Examine the NSError pointer to determine the failure
                    print("Error", error?.localizedDescription as Any)
                default:
                    // Handle all other cases
                    print("default")
                }
            })
        }
    }
}
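For reference, on iOS 15 and later the same "load before use" pattern can be written with the async load(_:) API instead of loadValuesAsynchronously. A minimal sketch, assuming iOS 15+ (the helper name is hypothetical):

import AVFoundation

// Sketch: load(_:) suspends until the requested properties are ready
// and throws if any of them fails to load.
func loadPlayableURL(from httpsStr: String) async -> URL? {
    guard let url = URL(string: httpsStr) else { return nil }
    let asset = AVURLAsset(url: url)
    do {
        _ = try await asset.load(.tracks, .duration, .isPlayable)
        return asset.url
    } catch {
        print("Asset loading failed:", error.localizedDescription)
        return nil
    }
}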

Related

AVPlayer playing wrong video file

I am having a weird situation and have no clue how to handle it. I am downloading videos from Firebase Storage and caching them on the device for future use; meanwhile, the background thread is already doing its job, and I am passing a video URL to the function that plays the video. The issue is that sometimes AVPlayer plays the right video and sometimes it picks some other video URL from the cache.
You can find the code below:
func cacheVideo(for exercise: Exercise) {
    print(exercise.imageFileName)
    guard let filePath = filePathURL(for: exercise.imageFileName) else { return }
    if fileManager.fileExists(atPath: filePath.path) {
        // print("already exists")
    } else {
        exercise.loadRealURL { (url) in
            print(url)
            self.getFileWith(with: url, saveTo: filePath)
        }
    }
}
Writing the file here:
func getFileWith(with url: URL, saveTo saveFilePathURL: URL) {
    DispatchQueue.global(qos: .background).async {
        print(saveFilePathURL.path)
        if let videoData = NSData(contentsOf: url) {
            videoData.write(to: saveFilePathURL, atomically: true)
            DispatchQueue.main.async {
                // print("downloaded")
            }
        } else {
            DispatchQueue.main.async {
                let error = NSError(domain: "SomeErrorDomain", code: -2001 /* some error code */, userInfo: ["description": "Can't download video"])
                print(error.debugDescription)
            }
        }
    }
}
Now playing the video using this:
func startPlayingVideoOnDemand(url: URL) {
    activityIndicatorView.startAnimating()
    activityIndicatorView.isHidden = false
    print(url)
    let cachingPlayerItem = CachingPlayerItem(url: url)
    cachingPlayerItem.delegate = self
    cachingPlayerItem.download()
    // cachingPlayerItem.preferredPeakBitRate = 0
    let avasset = AVAsset(url: url)
    let playerItem = AVPlayerItem(asset: avasset)
    let player = AVPlayer(playerItem: playerItem)
    player.automaticallyWaitsToMinimizeStalling = false
    initializeVideoLayer(for: player)
}
Any suggestions would be highly appreciated.
This was solved: the data model I was using to download a bunch of video files was being accessed on a background thread, while at the same time I was trying to assign the URL to the same data model class in order to fetch the video and play it in AVPlayer. That was the issue, and it was resolved by simply adding a new attribute to the data model for assigning the URL to play right away.
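A minimal sketch of that fix, assuming a hypothetical Exercise model (the property names are invented for illustration): the background caching path and the immediate-playback path each get their own property, so they never race on the same field.

// Hypothetical model: the caching code writes only remoteURL,
// while the player reads only playbackURL.
class Exercise {
    let imageFileName: String
    var remoteURL: URL?    // written by the background download/caching path
    var playbackURL: URL?  // assigned separately for immediate playback

    init(imageFileName: String) {
        self.imageFileName = imageFileName
    }
}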

How do I asynchronously download and cache videos for use in my app?

I know that SDWebImage loads the image in a background thread so you're not blocking the UI/main thread when this downloading is going on. Furthermore, it will also disk-cache all the images you've downloaded and will NEVER re-download an image from the same URL.
So I wonder if there is something similar or the same for videos?
Something to note: I add videos as a sublayer.
let videoURL = URL(string: postArray[indexPath.item].media[0].videoURLString!) // need to do error handling here
print(videoURL as Any, "<-- video url in display")
let player = AVPlayer(url: videoURL! as URL)
let playerLayer = AVPlayerLayer(player: player)
playerLayer.frame = CGRect(x: -8, y: 0, width: 138, height: 217) // cell.frame
cell.imageOrVideoView.layer.addSublayer(playerLayer)
// Other code and play()
This was recommended in the past, but it seems like it does something different, or at the very least has too much extra functionality I don't need.
Update:
What I am testing:
DispatchQueue.global(qos: .default).async(execute: {
    var downloadedData: Data? = nil
    if let url = URL(string: videoURL) {
        do {
            downloadedData = try Data(contentsOf: url)
        } catch {
            print(error, "downloaded Data failed")
        }
    }
    if downloadedData != nil {
        // STORE IN FILESYSTEM
        let cachesDirectory = NSSearchPathForDirectoriesInDomains(.cachesDirectory, .userDomainMask, true)[0]
        let file = URL(fileURLWithPath: cachesDirectory).appendingPathComponent(videoURL).absoluteString
        do {
            try downloadedData?.write(to: URL(string: file)!)
        } catch {
            print(error, "error downloading data and writing it")
        }
        // STORE IN MEMORY
        if let downloadedData = downloadedData {
            memoryCache?.setObject(downloadedData as AnyObject, forKey: videoURL as AnyObject)
        }
    }
    // NOW YOU CAN CREATE AN AVASSET OR UIIMAGE FROM THE FILE OR DATA
})
I do not understand, however, whether I should do something right after the last line, or after the }), or whether I need to update the UI there.
So I was able to solve the problem with the following:
Swift 4:
import Foundation

public enum Result<T> {
    case success(T)
    case failure(NSError)
}

class CacheManager {

    static let shared = CacheManager()

    private let fileManager = FileManager.default

    private lazy var mainDirectoryUrl: URL = {
        let documentsUrl = self.fileManager.urls(for: .cachesDirectory, in: .userDomainMask).first!
        return documentsUrl
    }()

    func getFileWith(stringUrl: String, completionHandler: @escaping (Result<URL>) -> Void) {
        let file = directoryFor(stringUrl: stringUrl)

        // return the file path if it already exists in the caches directory
        guard !fileManager.fileExists(atPath: file.path) else {
            completionHandler(Result.success(file))
            return
        }

        DispatchQueue.global().async {
            if let videoData = NSData(contentsOf: URL(string: stringUrl)!) {
                videoData.write(to: file, atomically: true)
                DispatchQueue.main.async {
                    completionHandler(Result.success(file))
                }
            } else {
                DispatchQueue.main.async {
                    let error = NSError(domain: "SomeErrorDomain", code: -2001 /* some error code */, userInfo: ["description": "Can't download video"])
                    completionHandler(Result.failure(error))
                }
            }
        }
    }

    private func directoryFor(stringUrl: String) -> URL {
        let fileURL = URL(string: stringUrl)!.lastPathComponent
        let file = self.mainDirectoryUrl.appendingPathComponent(fileURL)
        return file
    }
}
Usage:
CacheManager.shared.getFileWith(stringUrl: videoURL) { result in
    switch result {
    case .success(let url):
        // do some magic with the path to the saved video
        break
    case .failure(let error):
        // handle error
        print(error, "failure in the cache of video")
        break
    }
}
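One caveat with this CacheManager: NSData(contentsOf:) blocks its queue for the entire download and offers no progress or cancellation. A sketch of the same download step using URLSession instead; this is an assumption on my part, not part of the original answer (it reuses the Result type defined above):

// Sketch: URLSession streams the response to a temporary file,
// so large videos never have to fit in memory.
private func download(from url: URL, to file: URL,
                      completionHandler: @escaping (Result<URL>) -> Void) {
    let task = URLSession.shared.downloadTask(with: url) { tempURL, _, error in
        // The temporary file is deleted when this handler returns,
        // so move it into the cache before hopping to the main queue.
        let result: Result<URL>
        if let tempURL = tempURL {
            do {
                try FileManager.default.moveItem(at: tempURL, to: file)
                result = .success(file)
            } catch {
                result = .failure(error as NSError)
            }
        } else {
            let nsError = (error ?? NSError(domain: "SomeErrorDomain", code: -2001, userInfo: ["description": "Can't download video"])) as NSError
            result = .failure(nsError)
        }
        DispatchQueue.main.async { completionHandler(result) }
    }
    task.resume()
}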

Swift won't play audio from link - iOS

I am using Subsonic, so MP3 files are served to me via a web service.
When I test using files that have a .mp3 extension, this code works. When I use it with the link below, it does not.
var player: AVPlayer!

override func viewDidLoad() {
    super.viewDidLoad()
    let url = URL(string: "http://192.168.1.74:4040/rest/download?u=admin&p=admin&v=1.12.0&c=myapp&id=114&format=mp3")!
    let playerItem = CachingPlayerItem(url: url)
    playerItem.delegate = self
    player = AVPlayer(playerItem: playerItem)
    player.automaticallyWaitsToMinimizeStalling = false
    player.play()
}
Browsing to the link in url provides me with the file I'd expect. I have also successfully downloaded the file, saved it as an MP3, and played it from my documents container, but this is not how I want the application to work.
TL;DR: how can I get my application to play audio from a REST API without an extension?
You can use AVAssetResourceLoader to play audio without an extension.
Here is an example.
First, configure the delegate of the resource loader:
var playerAsset: AVAsset!
if fileURL.pathExtension.count == 0 {
    var components = URLComponents(url: fileURL, resolvingAgainstBaseURL: false)!
    components.scheme = "fake" // make a custom URL scheme
    components.path += ".mp3"
    playerAsset = AVURLAsset(url: components.url!)
    (playerAsset as! AVURLAsset).resourceLoader.setDelegate(self, queue: DispatchQueue.global())
} else {
    playerAsset = AVAsset(url: fileURL)
}
let playerItem = AVPlayerItem(asset: playerAsset)
Then, read the audio data and respond to the resource loader:
// MARK: - AVAssetResourceLoaderDelegate methods

func resourceLoader(_ resourceLoader: AVAssetResourceLoader, shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
    if let url = loadingRequest.request.url {
        var components = URLComponents(url: url, resolvingAgainstBaseURL: false)!
        components.scheme = NSURLFileScheme // replace with the real URL scheme
        components.path = String(components.path.dropLast(4))
        if let attributes = try? FileManager.default.attributesOfItem(atPath: components.url!.path),
            let fileSize = attributes[FileAttributeKey.size] as? Int64 {
            loadingRequest.contentInformationRequest?.isByteRangeAccessSupported = true
            loadingRequest.contentInformationRequest?.contentType = "audio/mpeg3"
            loadingRequest.contentInformationRequest?.contentLength = fileSize
            let requestedOffset = loadingRequest.dataRequest!.requestedOffset
            let requestedLength = loadingRequest.dataRequest!.requestedLength
            if let handle = try? FileHandle(forReadingFrom: components.url!) {
                handle.seek(toFileOffset: UInt64(requestedOffset))
                let data = handle.readData(ofLength: requestedLength)
                loadingRequest.dataRequest?.respond(with: data)
                loadingRequest.finishLoading()
                return true
            } else {
                return false
            }
        } else {
            return false
        }
    } else {
        return false
    }
}
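AVFoundation only consults the resource loader delegate for URL schemes it cannot handle natively, which is why the code swaps in the fake scheme before creating the asset and restores the real one inside the delegate. A hedged usage sketch (the retention note reflects the standard unretained-delegate pattern):

// Usage sketch: the resource loader does not retain its delegate, so
// keep a strong reference to the delegate object (here, self) for as
// long as the item is playing.
let player = AVPlayer(playerItem: playerItem)
player.play()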

Custom camera, video is not playing with audio in Swift

I am new to Swift and also to Stack Overflow. Thanks in advance for your attention.
Basically, I'm trying to build a custom camera that will record video with audio, meaning the video will play with sound. For the last few days I've been trying to build this custom camera. I already followed a tutorial, but something is still missing: my camera only records video, and apparently not audio. I don't understand why, and I haven't found an appropriate answer for this.
Here is what I did:
import UIKit
import AVFoundation
import SVProgressHUD
import MediaPlayer
import MobileCoreServices
import AVKit

var videoUrl = [AnyObject]()

class TestViewController: UIViewController {

    @IBOutlet var viewVidioPlayer: UIView!
    @IBOutlet weak var myView: UIView!

    var session: AVCaptureSession?
    var userreponsevideoData = NSData()
    var userreponsethumbimageData = NSData()

    override func viewDidLoad() {
        super.viewDidLoad()
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
    }
    // here I create the session
    func createSession() {
        var input: AVCaptureDeviceInput?
        let movieFileOutput = AVCaptureMovieFileOutput()
        var prevLayer: AVCaptureVideoPreviewLayer?
        prevLayer?.frame.size = myView.frame.size
        session = AVCaptureSession()
        let error: NSError? = nil
        do {
            input = try AVCaptureDeviceInput(device: self.cameraWithPosition(position: .front)!)
        } catch {
            return
        }
        if error == nil {
            session?.addInput(input)
        } else {
            print("camera input error: \(String(describing: error))")
        }
        prevLayer = AVCaptureVideoPreviewLayer(session: session)
        prevLayer?.frame.size = myView.frame.size
        prevLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
        prevLayer?.connection.videoOrientation = .portrait
        myView.layer.addSublayer(prevLayer!)
        let documentsURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
        let filemainurl = NSURL(string: ("\(documentsURL.appendingPathComponent("temp"))" + ".mp4"))
        let maxDuration: CMTime = CMTimeMake(600, 10)
        movieFileOutput.maxRecordedDuration = maxDuration
        movieFileOutput.minFreeDiskSpaceLimit = 1024 * 1024
        if self.session!.canAddOutput(movieFileOutput) {
            self.session!.addOutput(movieFileOutput)
        }
        session?.startRunning()
        movieFileOutput.startRecording(toOutputFileURL: filemainurl! as URL, recordingDelegate: self)
    }
    func cameraWithPosition(position: AVCaptureDevicePosition) -> AVCaptureDevice? {
        let devices = AVCaptureDevice.devices(withMediaType: AVMediaTypeVideo)
        for device in devices! {
            if (device as AnyObject).position == position {
                return device as? AVCaptureDevice
            }
        }
        return nil
    }

    @IBAction func pressbackbutton(sender: AnyObject) {
        session?.stopRunning()
    }

    @IBAction func Record(_ sender: Any) {
        createSession()
    }

    @IBAction func play(_ sender: Any) {
        self.videoPlay()
    }
    func videoPlay() {
        let documentsUrl = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
        do {
            // Get the directory contents urls (including subfolders urls)
            let directoryContents = try FileManager.default.contentsOfDirectory(at: documentsUrl, includingPropertiesForKeys: nil, options: [])
            print(directoryContents)
            // if you want to filter the directory contents you can do like this:
            videoUrl = directoryContents.filter { $0.pathExtension == "mp4" } as [AnyObject]
            print("mp4 urls:", videoUrl)
            let playerController = AVPlayerViewController()
            playerController.delegate = self as? AVPlayerViewControllerDelegate
            let movieURL = videoUrl[0]
            print(movieURL)
            let player = AVPlayer(url: movieURL as! URL)
            playerController.player = player
            self.addChildViewController(playerController)
            self.view.addSubview(playerController.view)
            playerController.view.frame = self.view.frame
            player.play()
            player.volume = 1.0
            player.rate = 1.0
        } catch let error as NSError {
            print(error.localizedDescription)
        }
    }
}
extension TestViewController: AVCaptureFileOutputRecordingDelegate {

    @available(iOS 4.0, *)
    private func captureOutput(captureOutput: AVCaptureFileOutput!, didStartRecordingToOutputFileAtURL fileURL: URL!, fromConnections connections: [AnyObject]!) {
        print(fileURL)
    }

    func capture(_ captureOutput: AVCaptureFileOutput!, didFinishRecordingToOutputFileAt outputFileURL: URL!, fromConnections connections: [Any]!, error: Error!) {
        let filemainurl = outputFileURL
        do {
            let asset = AVURLAsset(url: filemainurl! as URL, options: nil)
            print(asset)
            let imgGenerator = AVAssetImageGenerator(asset: asset)
            imgGenerator.appliesPreferredTrackTransform = true
            let cgImage = try imgGenerator.copyCGImage(at: CMTimeMake(0, 1), actualTime: nil)
            let uiImage = UIImage(cgImage: cgImage)
            userreponsethumbimageData = try NSData(contentsOf: filemainurl! as URL)
            print(userreponsethumbimageData.length)
            print(uiImage)
            // imageData = UIImageJPEGRepresentation(uiImage, 0.1)
        } catch let error as NSError {
            print(error)
            return
        }
        SVProgressHUD.show(with: SVProgressHUDMaskType.clear)
        let VideoFilePath = NSURL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("mergeVideo\(arc4random() % 1000)d")!.appendingPathExtension("mp4").absoluteString
        if FileManager.default.fileExists(atPath: VideoFilePath) {
            do {
                try FileManager.default.removeItem(atPath: VideoFilePath)
            } catch { }
        }
        let tempfilemainurl = NSURL(string: VideoFilePath)!
        let sourceAsset = AVURLAsset(url: filemainurl! as URL, options: nil)
        let assetExport: AVAssetExportSession = AVAssetExportSession(asset: sourceAsset, presetName: AVAssetExportPresetMediumQuality)!
        assetExport.outputFileType = AVFileTypeQuickTimeMovie
        assetExport.outputURL = tempfilemainurl as URL
        assetExport.exportAsynchronously { () -> Void in
            switch assetExport.status {
            case AVAssetExportSessionStatus.completed:
                DispatchQueue.main.async(execute: {
                    do {
                        SVProgressHUD.dismiss()
                        self.userreponsevideoData = try NSData(contentsOf: tempfilemainurl as URL, options: NSData.ReadingOptions())
                        print("MB - \(self.userreponsevideoData.length) byte")
                    } catch {
                        SVProgressHUD.dismiss()
                        print(error)
                    }
                })
            case AVAssetExportSessionStatus.failed:
                print("failed \(assetExport.error)")
            case AVAssetExportSessionStatus.cancelled:
                print("cancelled \(assetExport.error)")
            default:
                print("complete")
                SVProgressHUD.dismiss()
            }
        }
    }
}
That's all I have done, so I don't understand what is missing from this code: why is audio not playing with the video, or why is audio not being recorded with the video?
Use this CocoaPod for your project; it makes your job quite easy. It has instructions on what to do and also contains a demo project to verify it works as you intended:
SwiftyCam
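For what it's worth, the session in the question only ever adds a camera input, which would explain silent recordings. A minimal sketch of adding a microphone input, written against the same Swift 3-era API as the question (this is an assumption about the fix, not part of the original answer):

// Hedged sketch: add an audio input so AVCaptureMovieFileOutput
// records an audio track alongside the video.
if let microphone = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeAudio),
    let audioInput = try? AVCaptureDeviceInput(device: microphone) {
    if session?.canAddInput(audioInput) == true {
        session?.addInput(audioInput)
    }
}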

Swift and Xcode 7 beta 2 are not displaying the metadata of the songs playing

I'm writing a test program dealing with audio and a table view. Right now I have the controller working and playing the chosen song correctly, but I'm trying to display the metadata from the MP3 file and it's not doing anything. Currently I have it like this:
Function to print the data:
func testAudioStuff(testURL testURL: NSURL) {
    let audioInfo = AVPlayerItem(URL: testURL)
    let metaDataList = audioInfo.asset.metadata as [AVMetadataItem]
    // here it should print something; it does not
    for item in metaDataList {
        if item.commonKey == nil {
            continue
        }
        if let key = item.commonKey, let value = item.value {
            print(key)
            print(value as! String)
            if key == "title" {
                print(key)
                print("here is the title" + (value as! String))
            }
            if key == "artist" {
                print(key)
                print("here is the artist" + (value as! String))
            }
            if key == "artwork" {
                print("here is the artwork" + (value as! String))
            }
        }
    }
}
Function to play the music:
func playSelectedMusic(song: Int, section: Int) {
    if section == 1 {
        //print(NSBundle.mainBundle().description + "\(song) \(section)")
        if let starMusic = productsAndMusic["Music"] {
            //print("We are looking at \(starMusic[song])")
            let play = starMusic[song].componentsSeparatedByString(".")[0]
            //print(play)
            if let fileToPlay = NSBundle.mainBundle().pathForResource(play, ofType: "mp3") {
                //print(fileToPlay)
                testAudioStuff(testURL: NSURL(string: fileToPlay)!)
                do {
                    player = try AVAudioPlayer(contentsOfURL: NSURL(string: fileToPlay)!)
                } catch Errors.GeneralError {
                } catch let error as NSError {
                    print("i dunno, this does not come with instructions \(error)")
                } catch let something as ErrorType {
                    print("something else \(something)")
                }
                player.prepareToPlay()
                player.play()
            }
            //player = AVAudioPlayer(contentsOfURL: NSURL(string: NSBundle.mainBundle().pathForResource(play, ofType: "mp3")!))
            //player.prepareToPlay()
            //print(NSBundle.mainBundle().pathForResource(play, ofType: "mp3"))
        }
        //print(NSBundle.mainBundle().pathForResource((productsAndMusic["Music"]![song] as String).componentsSeparatedByString(".")[0], ofType: "mp3")!)
        //print((productsAndMusic["Music"]![song] as String).componentsSeparatedByString(".")[0])
    } else {
    }
}
As I mentioned, it plays the selected song in the simulator, but it does not print the metadata. Why? Any help? This is in Xcode 7 beta 2.
I have tried something else, but no dice:
var secondTestTitle: AVAsset = AVAsset(URL: testURL)
print(secondTestTitle.description)
var metaStuff: [AVMetadataItem] = secondTestTitle.commonMetadata as [AVMetadataItem]
print(metaStuff.count)
Result:
<AVURLAsset: 0x78ea3930, URL = /Users/pedro/Library/Developer/CoreSimulator/Devices/304FE5A7-9506-4A9B-B685-5CDBE9AFB4C4/data/Containers/Bundle/Applica ... ock1.mp3>
0
If this is an MP3 file in your app bundle, then let's suppose it is called test.mp3. Then you can say:
let url = NSBundle.mainBundle().URLForResource("test", withExtension:"mp3")
let asset = AVAsset(URL: url!)
let meta = asset.metadata
print(meta)
If you don't see any metadata in the console when you say that, then this file has no metadata.
However, that's not what I would do, because it's not a very realistic test — and it isn't likely to be very useful, because most information is not written into the file as metadata. More likely, you're going to want to know about a song in the user's Music library. In that case you would use the Media Player framework, as I describe in my book.
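As a rough illustration of that last suggestion, here is a sketch of reading song info from the user's Music library with the Media Player framework (Swift 2 style to match the answer above; this sketch is mine, not from the book, and it requires media library access):

import MediaPlayer

// Hedged sketch: query the user's Music library instead of parsing file
// metadata; MPMediaItem exposes title, artist, artwork, and more.
let query = MPMediaQuery.songsQuery()
for item in query.items ?? [] {
    print(item.title ?? "untitled", item.artist ?? "unknown artist")
}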
