I'm using AVPlayer to play URLs I'm fetching from my backend. Initially, I downloaded the items to my documents directory and used the URLs to play the files via AVAudioPlayer. I switched over to AVPlayer so I can stream the audio instead of downloading it. I can see that the URLs are being fetched successfully, but once I try to play them I get no audio. Below is an example of a URL I'm fetching:
/Users/ellie/Desktop/ellie/sound/uploads/ellie1/Track5.m4a
var player: AVPlayer!
var fetchedURL: NSURL?

func tableView(tableView: UITableView, didSelectRowAtIndexPath indexPath: NSIndexPath) {
    // I left out the fetching process
    self.fetchedURL = NSURL(string: parseString!)
    print("fetchedURL is \(self.fetchedURL!)")
    self.playCell()
}

func playCell() {
    let audioSession: AVAudioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setCategory(AVAudioSessionCategoryPlayback)
    } catch _ {
    }
    do {
        try audioSession.setActive(true)
    } catch _ {
    }
    print("fetchedURL is \(self.fetchedURL!)")
    player = AVPlayer(URL: self.fetchedURL!)
    player.play()
}
AVPlayer will only play locally or remotely hosted video files, plus proper streaming links. To stream, you need to make sure a proper streaming URL is used; some examples can be found here: https://stackoverflow.com/questions/10104301/hls-streaming-video-url-need-for-testing.
Note that it is not a trivial task to convert a video file into a hosted streaming link. Services such as Vimeo let you upload and encode video files, but they only provide a streaming link on the 'Pro' plan.
Other options include configuring AWS S3 buckets (with CloudFront) to host and encode your video files: http://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/TutorialStreamingJWPlayer.html
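For reference, a minimal sketch of streaming a remote file with AVPlayer, assuming your backend returns an HTTP(S) URL the device can actually reach rather than a path on the server's file system (the Sintel playlist below is only a public test stream used for illustration):

import AVFoundation

// Keep a strong reference (e.g. a property) so the player is not deallocated.
var player: AVPlayer?

func playStream() {
    let audioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setCategory(AVAudioSessionCategoryPlayback)
        try audioSession.setActive(true)
    } catch {
        print("Audio session error: \(error)")
    }

    // The URL must be reachable over HTTP(S); a local path such as
    // /Users/ellie/Desktop/... only exists on the server's machine.
    if let streamURL = URL(string: "https://bitdash-a.akamaihd.net/content/sintel/hls/playlist.m3u8") {
        player = AVPlayer(url: streamURL)
        player?.play()
    }
}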
I'm selecting the audio file from the folder and getting the URL in this delegate function
func documentPicker(_ controller: UIDocumentPickerViewController, didPickDocumentAt url: URL) {
print("url", url)
self.filePickedBlock?(url)
}
The URL looks like this:
file:///private/var/mobile/Containers/Data/Application/8012C1BB-43E3-4E0C-9167-B7B5A49824A1/tmp/com.ios.PrankYourFriends-Inbox/AUDIO1.m4a
I am saving this URL as a string in a Realm database and fetching it when I want to play the file.
Issue I am Facing
When I play this audio immediately after selecting it, it plays fine. But if I go to another view controller, come back, and press the play button, the audio does not play.
I am playing the audio like this:
self.avPlayer = AVPlayer.init(url: filePath)
self.avPlayer?.play()
Kindly let me know what I am doing wrong.
It's in the tmp folder:
/tmp/com.ios.PrankYourFriends-Inbox/AUDIO1.m4a
Copy it to, say, Documents or Library, then you can use it any time.
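For example, a minimal sketch of copying the picked file out of tmp into the Documents directory before saving its location (the function name and the idea of storing only the file name are assumptions, not part of the original answer):

import Foundation

func persistPickedAudio(from pickedURL: URL) -> URL? {
    let fileManager = FileManager.default
    let documentsURL = fileManager.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let destinationURL = documentsURL.appendingPathComponent(pickedURL.lastPathComponent)
    do {
        // Remove any stale copy, then copy the file out of the temporary inbox.
        if fileManager.fileExists(atPath: destinationURL.path) {
            try fileManager.removeItem(at: destinationURL)
        }
        try fileManager.copyItem(at: pickedURL, to: destinationURL)
        return destinationURL
    } catch {
        print("Could not copy audio file: \(error)")
        return nil
    }
}

It is also safer to store only the file name (or a path relative to Documents) in Realm and rebuild the full URL at playback time, because the absolute sandbox path can change between app launches.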
I want to play an audio file (a WAV file, for example) and, at specific points in the track, fire events or triggers that will control an external device.
My idea for now is to generate a MIDI track that plays in sync with the audio track; when the MIDI notes are played, trigger events are generated that we can handle to do whatever we want.
Where I am stuck right now is how to play the .mid file and generate events when MIDI notes are played. I also want to play the WAV and MIDI files in sync, but that is not what I am solving at this point.
I looked into AudioKit, but the examples seem out of date and the documentation isn't helping a lot.
Is MIDI the right approach for this? Is there an easier way on iOS where I don't have to use AudioKit and can just use something from AVFoundation?
I want to understand which tool is best for detecting when a MIDI note from the .mid file is played and handling the event.
My research pointed me to AKAppleSequencer. What would help is a simple example that loads a MIDI file and basically prints something when a note is played.
I came across these posts,
How to connect AKSequencer to a AKCallbackInstrument?
Play MIDI file together with wav AudioKit
but AKSequencer has since been replaced by AKAppleSequencer.
So I figured it out. The answer was basically in the posts above; I just updated the code so it uses AKAppleSequencer.
let sequencer = AKAppleSequencer(filename: "SaReGaMa") // the .mid file
let callbackInstr = AKMIDICallbackInstrument()
var player: AKPlayer!

func initializeSession() {
    callbackInstr.callback = myCallBack
    sequencer.setGlobalMIDIOutput(callbackInstr.midiIn)
    if let audioFile = try? AKAudioFile(readFileName: "SaReGaMa.wav") {
        player = AKPlayer(audioFile: audioFile)
        player.completionHandler = { print("Finished playing file") }
        player.buffering = .always
        AudioKit.output = player
        do {
            try AudioKit.start()
        } catch {
            print("Error starting AudioKit, \(error)")
        }
    }
}

// The callback gets triggered when each MIDI note is played by the sequencer.
func myCallBack(a: UInt8, b: MIDINoteNumber, c: MIDIVelocity) -> () {
    print(a, b, c)
}

// These functions let you control the playback.
func play() {
    player.play()
    sequencer.play()
}

func pause() {
    sequencer.stop()
    player.pause()
}
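If you only want to react to note-on events and ignore everything else the sequencer sends, you can filter on the status byte inside the callback. This is a hedged addition, not part of the original answer; the upper-nibble 0x9 check is standard MIDI, and a note-on with velocity 0 is conventionally treated as a note-off:

func myCallBack(a: UInt8, b: MIDINoteNumber, c: MIDIVelocity) -> () {
    // Upper nibble 0x9 = note-on; skip note-offs and other messages.
    if a >> 4 == 0x9 && c > 0 {
        print("Note on: \(b), velocity: \(c)")
        // Fire the trigger for your external device here.
    }
}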
I need to create an audio player for a streamed URL (m3u8 format). I have created a music player using AVPlayer, but I need to show a visualizer for the streamed song. I have tried different solutions but have not found any working example.
I have created a visualizer using AVAudioPlayer (averagePower), but it doesn't support streamed URLs.
Any help showing a visualizer for AVPlayer? Thanks in advance.
I have also tried MYAudioTapProcessor, which most people suggest, but for a streamed URL the tracks array always comes back empty.
I added MYAudioTapProcessor.h and MYAudioTapProcessor.m to the project.
// Initialization of the player
let playerItem = AVPlayerItem(url: URL(string: "https://bitdash-a.akamaihd.net/content/sintel/hls/playlist.m3u8")!)
let audioPlayer: AVPlayer = AVPlayer(playerItem: playerItem)

// Added a periodic time observer
audioPlayer.addPeriodicTimeObserver(forInterval: CMTimeMakeWithSeconds(1, 1), queue: DispatchQueue.main) { _ in
    if audioPlayer.currentItem?.status == .readyToPlay {
        if let playerItem: AVPlayerItem = audioPlayer.currentItem {
            print(playerItem.asset.tracks.count) // always prints 0 for the m3u8 stream
            if !playerItem.asset.tracks.isEmpty {
                self.tapProcessor = MYAudioTapProcessor(avPlayerItem: playerItem)
                playerItem.audioMix = self.tapProcessor.audioMix
                self.tapProcessor.delegate = self
            }
        }
    }
}
// Delegate callback method for MYAudioTapProcessor
func audioTabProcessor(_ audioTabProcessor: MYAudioTapProcessor!, hasNewLeftChannelValue leftChannelValue: Float, rightChannelValue: Float) {
    print("volume: \(leftChannelValue) : \(rightChannelValue)")
    volumeSlider.value = leftChannelValue
}
I also tried adding a "tracks" observer:
playerItem.addObserver(self, forKeyPath: "tracks", options: NSKeyValueObservingOptions.new, context: nil);
Now if I play an mp3 file, the callback method is called, but for m3u8 it is not. The main reason the m3u8 URL fails is that its tracks array count is always zero, whereas for mp3 files the tracks array has one item.
You cannot get tracks for HLS via AVPlayer. You should use progressive download or a local file if you need access to the audio tracks while playing media.
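If you do fall back to a local (or progressively downloaded) file, one alternative that avoids MYAudioTapProcessor entirely is to play it through AVAudioEngine and install a tap on the mixer to compute levels for the visualizer. This is a rough sketch under that assumption, not from the original answer, and updateMeter(_:) is a hypothetical hook for your own UI:

import AVFoundation

let engine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()

func playLocalFile(url: URL) throws {
    let file = try AVAudioFile(forReading: url)
    engine.attach(playerNode)
    engine.connect(playerNode, to: engine.mainMixerNode, format: file.processingFormat)

    // Tap the mixer output and compute a simple RMS level per buffer.
    engine.mainMixerNode.installTap(onBus: 0, bufferSize: 1024, format: nil) { buffer, _ in
        guard let channelData = buffer.floatChannelData?[0] else { return }
        let frames = Int(buffer.frameLength)
        var sum: Float = 0
        for i in 0..<frames { sum += channelData[i] * channelData[i] }
        let rms = (sum / Float(max(frames, 1))).squareRoot()
        DispatchQueue.main.async {
            // self.updateMeter(rms) // hypothetical visualizer update
            print("level: \(rms)")
        }
    }

    try engine.start()
    playerNode.scheduleFile(file, at: nil, completionHandler: nil)
    playerNode.play()
}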
Since iOS 10, Apple has provided support for downloading HLS (m3u8) video for offline viewing.
My question is: can we only download HLS while it is being played, or can we start the download when the user presses a download button and show progress?
Has anyone implemented this in Objective-C? My previous app is written in Objective-C, and now I want to add support for downloading HLS rather than MP4 (previously I downloaded MP4 for offline viewing).
I am really desperate for this. Please share your thoughts, or any code if you have implemented it.
I used the Apple code guide to download HLS content with the following code:
var configuration: URLSessionConfiguration?
var downloadSession: AVAssetDownloadURLSession?
var downloadIdentifier = "\(Bundle.main.bundleIdentifier!).background"

func setupAssetDownload(videoUrl: String) {
    // Create new background session configuration.
    configuration = URLSessionConfiguration.background(withIdentifier: downloadIdentifier)

    // Create a new AVAssetDownloadURLSession with background configuration, delegate, and queue
    downloadSession = AVAssetDownloadURLSession(configuration: configuration!,
                                                assetDownloadDelegate: self,
                                                delegateQueue: OperationQueue.main)

    if let url = URL(string: videoUrl) {
        let asset = AVURLAsset(url: url)

        // Create new AVAssetDownloadTask for the desired asset
        let downloadTask = downloadSession?.makeAssetDownloadTask(asset: asset,
                                                                  assetTitle: "Some Title",
                                                                  assetArtworkData: nil,
                                                                  options: nil)
        // Start task and begin download
        downloadTask?.resume()
    }
} // end method

func urlSession(_ session: URLSession, assetDownloadTask: AVAssetDownloadTask, didFinishDownloadingTo location: URL) {
    // Do not move the asset from the download location
    UserDefaults.standard.set(location.relativePath, forKey: "testVideoPath")
}
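To address the "show progress" part of the question: you don't have to wait until playback; the download can be started from a button tap (as above) and progress tracked with another AVAssetDownloadDelegate callback. A sketch along those lines, where progressView is a hypothetical progress bar in your UI:

func urlSession(_ session: URLSession,
                assetDownloadTask: AVAssetDownloadTask,
                didLoad timeRange: CMTimeRange,
                totalTimeRangesLoaded loadedTimeRanges: [NSValue],
                timeRangeExpectedToLoad: CMTimeRange) {
    // Sum the durations of the loaded ranges and divide by the expected duration.
    var percentComplete = 0.0
    for value in loadedTimeRanges {
        let loadedTimeRange = value.timeRangeValue
        percentComplete += CMTimeGetSeconds(loadedTimeRange.duration) / CMTimeGetSeconds(timeRangeExpectedToLoad.duration)
    }
    DispatchQueue.main.async {
        // self.progressView.progress = Float(percentComplete) // hypothetical UI update
        print("Download progress: \(percentComplete * 100)%")
    }
}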
If you don't understand what's going on, read up on it here:
https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/MediaPlaybackGuide/Contents/Resources/en.lproj/HTTPLiveStreaming/HTTPLiveStreaming.html
Now you can use the stored HLS content to play the video in AVPlayer with the following code:
// Get the saved link from the user defaults
let savedLink = UserDefaults.standard.string(forKey: "testVideoPath")
let baseUrl = URL(fileURLWithPath: NSHomeDirectory()) // the app's home directory
let assetUrl = baseUrl.appendingPathComponent(savedLink!) // append the saved link to the home path
Now use the path to play the video in AVPlayer:
let avAsset = AVAsset(url: assetUrl)
let playerItem = AVPlayerItem(asset: avAsset)
let player = AVPlayer(playerItem: playerItem) // video path coming from the function above
let playerViewController = AVPlayerViewController()
playerViewController.player = player
self.present(playerViewController, animated: true, completion: {
    player.play()
})
The only way you can do this is to set up an HTTP server to serve the files locally after you've downloaded them.
A live playlist uses a sliding window. You need to reload it periodically, after the target duration, and download only the new segments as they appear in the list (older ones are removed later).
Here are some related answers: Can IOS devices stream m3u8 segmented video from the local file system using html5 video and phonegap/cordova?
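As an illustration of the local-server approach, here is a rough sketch using the third-party GCDWebServer library to serve a downloaded playlist and its segments from the Documents directory. The library choice, port, and file layout are assumptions, so check the exact API of the version you install:

import AVFoundation
import GCDWebServer

let webServer = GCDWebServer()
var player: AVPlayer? // keep a strong reference

func playDownloadedStream() {
    let documentsPath = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0]

    // Serve everything under Documents; range requests let AVPlayer seek.
    webServer.addGETHandler(forBasePath: "/",
                            directoryPath: documentsPath,
                            indexFilename: nil,
                            cacheAge: 3600,
                            allowRangeRequests: true)

    if webServer.start(withPort: 8080, bonjourName: nil) {
        // Assuming the playlist was saved as Documents/stream/playlist.m3u8 (hypothetical path).
        if let localURL = URL(string: "http://localhost:8080/stream/playlist.m3u8") {
            player = AVPlayer(url: localURL)
            player?.play()
        }
    }
}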
You can easily download an HLS stream with AVAssetDownloadURLSession's makeAssetDownloadTask. Have a look at the AssetPersistenceManager in Apple's sample code: https://developer.apple.com/library/content/samplecode/HLSCatalog/Introduction/Intro.html
It should be fairly straightforward to use the Objective-C version of the API.
Yes, you can download a video stream served over HLS and watch it later.
There is a very straightforward sample app (HLSCatalog) from Apple on this. The code is fairly simple. You can find it here: https://developer.apple.com/services-account/download?path=/Developer_Tools/FairPlay_Streaming_Server_SDK_v3.1/FairPlay_Streaming_Server_SDK_v3.1.zip
You can find more about offline HLS streaming here.
I'm trying to play an AVI video in an application created in Xcode and written in Swift.
The thing is, all of the videos I have are AVI and I cannot afford to spend time converting them right now. Is there a way to play the video with the provided media player library?
The code I wrote plays only the audio of the video:
func playVideo() {
    let path = NSBundle.mainBundle().pathForResource(symbol, ofType: "avi")
    let url = NSURL.fileURLWithPath(path!)
    moviePlayer = MPMoviePlayerController(contentURL: url)
    if let player = moviePlayer {
        player.view.frame = self.view.bounds
        player.prepareToPlay()
        player.scalingMode = .AspectFill
        self.view.addSubview(player.view)
    }
}
Or maybe there's a way to convert the video when the function is called?
Hi,
following the documentation of MPMoviePlayerController:
For movie files, this typically means files with the extensions .mov, .mp4, .mpv, and .3gp
So, unfortunately, you can't use MPMoviePlayerController to read your AVI video.
Thanks to the VLC for iOS source code being available, you can reuse it to implement a custom player that reads these files.
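For example, a rough sketch of what that could look like with the MobileVLCKit library (the view outlet, class name, and bundled file name are assumptions; check the API of the version you install):

import UIKit
import MobileVLCKit

class AviPlayerViewController: UIViewController {
    // A plain UIView used as the rendering surface for VLC.
    @IBOutlet weak var videoView: UIView!
    let mediaPlayer = VLCMediaPlayer()

    override func viewDidLoad() {
        super.viewDidLoad()
        // "MyMovie.avi" is a hypothetical file bundled with the app.
        if let path = Bundle.main.path(forResource: "MyMovie", ofType: "avi") {
            mediaPlayer.media = VLCMedia(url: URL(fileURLWithPath: path))
            mediaPlayer.drawable = videoView
            mediaPlayer.play()
        }
    }
}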