Play continuous video stream of NSData - iOS

I am working on a video-calling application that involves streaming. I get a continuous NSData stream in a delegate callback. Could anyone tell me how to render this continuous NSData stream using Objective-C?
Should I use AVPlayer or MetalKit for rendering the data?
- (void)videoReceivedFrame:(NSData *)data;
This delegate method keeps getting called continuously while streaming (this is where I receive the video data as NSData). Could anyone tell me how to render it?

Here:
var videoData: Data // some video data
var fileURL: URL    // some local path, preferably under NSTemporaryDirectory()
try? videoData.write(to: fileURL) // prefer try? or do/catch over a force try
let item = AVPlayerItem(url: fileURL)
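Note that AVPlayerItem can only play the written data if it forms a complete, playable media file (an MP4, for instance), not raw frames. To put the item on screen, attach it to an AVPlayer and an AVPlayerLayer. A minimal sketch, where videoView is a hypothetical container view:
import AVFoundation
import UIKit

let player = AVPlayer(playerItem: item)
let playerLayer = AVPlayerLayer(player: player)
playerLayer.frame = videoView.bounds      // videoView is a hypothetical UIView
playerLayer.videoGravity = .resizeAspect
videoView.layer.addSublayer(playerLayer)
player.play()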

Related

Get audio duration with AVPlayer - Swift

I'm trying to get my audio file's duration to display in my app, but my code always returns 0. I hope there is a method that notifies me when the AVPlayer has loaded data from the file, so that I can ask for the duration afterwards. Any suggestions?
func loadAudioUrl() {
    guard let url = URL(string: sampleShortAudioUrl) else { return }
    audioPlayer = AVPlayer(url: url)
    audioPlayer?.play()
    if let duration = audioPlayer?.currentItem?.duration {
        print(duration)
    }
}
You can get the duration, but you need to wait, because the content is still loading; your code assumes it is loaded instantly.
You need to use AVPlayerItem with AVPlayer.
When the AVPlayerItem status is ready to play, you can ask for the duration. A complete code example from Apple is right here:
AVPlayerItem
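A minimal sketch of that pattern using block-based KVO (assuming sampleShortAudioUrl from the question and Swift 4 or later):
import AVFoundation

var player: AVPlayer?
var statusObservation: NSKeyValueObservation? // must be retained for the observation to stay alive

func loadAudioUrl() {
    guard let url = URL(string: sampleShortAudioUrl) else { return }
    let item = AVPlayerItem(url: url)
    // Ask for the duration only once the item reports it is ready to play.
    statusObservation = item.observe(\.status, options: [.new]) { item, _ in
        if item.status == .readyToPlay {
            print("duration:", item.duration.seconds)
        }
    }
    player = AVPlayer(playerItem: item)
    player?.play()
}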
You can't get the duration from a remote audio URL; you must store its duration remotely in your database and grab it while listening or downloading...

Downloading and playing offline HLS Content - iOS 10

Since iOS 10, Apple has provided support for downloading HLS (m3u8) video for offline viewing.
My question is: is it necessary that HLS can only be downloaded while it is being played? Or can we download it when the user presses a download button and show progress?
Has anyone implemented this in Objective-C? My previous app is written in Objective-C, and I now want to add support for downloading HLS rather than MP4 (previously I downloaded MP4 for offline viewing).
Please share thoughts or any code if you have implemented this.
I used the Apple code guide to download HLS content with the following code:
var configuration: URLSessionConfiguration?
var downloadSession: AVAssetDownloadURLSession?
var downloadIdentifier = "\(Bundle.main.bundleIdentifier!).background"

func setupAssetDownload(videoUrl: String) {
    // Create new background session configuration.
    configuration = URLSessionConfiguration.background(withIdentifier: downloadIdentifier)

    // Create a new AVAssetDownloadURLSession with background configuration, delegate, and queue.
    downloadSession = AVAssetDownloadURLSession(configuration: configuration!,
                                                assetDownloadDelegate: self,
                                                delegateQueue: OperationQueue.main)

    if let url = URL(string: videoUrl) {
        let asset = AVURLAsset(url: url)

        // Create new AVAssetDownloadTask for the desired asset.
        let downloadTask = downloadSession?.makeAssetDownloadTask(asset: asset,
                                                                  assetTitle: "Some Title",
                                                                  assetArtworkData: nil,
                                                                  options: nil)
        // Start task and begin download.
        downloadTask?.resume()
    }
} // end method

func urlSession(_ session: URLSession, assetDownloadTask: AVAssetDownloadTask, didFinishDownloadingTo location: URL) {
    // Do not move the asset from the download location.
    UserDefaults.standard.set(location.relativePath, forKey: "testVideoPath")
}
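The question also asks about showing progress. AVAssetDownloadDelegate reports loaded time ranges for exactly that; a sketch of the delegate method, assuming it sits in the same class as the code above (progressView is hypothetical):
func urlSession(_ session: URLSession,
                assetDownloadTask: AVAssetDownloadTask,
                didLoad timeRange: CMTimeRange,
                totalTimeRangesLoaded loadedTimeRanges: [NSValue],
                timeRangeExpectedToLoad: CMTimeRange) {
    var percentComplete = 0.0
    // Sum the loaded ranges against the expected total duration.
    for value in loadedTimeRanges {
        let loadedTimeRange = value.timeRangeValue
        percentComplete += loadedTimeRange.duration.seconds / timeRangeExpectedToLoad.duration.seconds
    }
    // Update your UI here, e.g. progressView.progress = Float(percentComplete)
    print("download progress: \(percentComplete)")
}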
If you don't understand what's going on, read up about it here:
https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/MediaPlaybackGuide/Contents/Resources/en.lproj/HTTPLiveStreaming/HTTPLiveStreaming.html
Now you can use the stored HLS content to play the video in AVPlayer with the following code:
// Get the saved link from the user defaults.
let savedLink = UserDefaults.standard.string(forKey: "testVideoPath")
let baseUrl = URL(fileURLWithPath: NSHomeDirectory())     // the app's home directory
let assetUrl = baseUrl.appendingPathComponent(savedLink!) // append the saved link to the home path
Now use the path to play the video in AVPlayer:
let avAsset = AVAsset(url: assetUrl)
let playerItem = AVPlayerItem(asset: avAsset)
let player = AVPlayer(playerItem: playerItem) // video path coming from the function above
let playerViewController = AVPlayerViewController()
playerViewController.player = player
self.present(playerViewController, animated: true, completion: {
    player.play()
})
The only way you can do this is to set up an HTTP server to serve the files locally after you've downloaded them.
A live playlist uses a sliding window: you need to reload it periodically (after the target duration) and download only the new segments as they appear in the list (older segments are removed over time).
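A rough sketch of that reload loop, assuming playlistURL points at the live media playlist (real code would also honor EXT-X-MEDIA-SEQUENCE and schedule the next reload after the target duration):
var seenSegments = Set<String>()

func reloadPlaylist(at playlistURL: URL) {
    URLSession.shared.dataTask(with: playlistURL) { data, _, _ in
        guard let data = data, let text = String(data: data, encoding: .utf8) else { return }
        // In an m3u8 playlist, lines that don't start with "#" are segment URIs.
        let segments = text.split(separator: "\n").map(String.init)
            .filter { !$0.isEmpty && !$0.hasPrefix("#") }
        for segment in segments where !seenSegments.contains(segment) {
            seenSegments.insert(segment)
            // Resolve the segment relative to the playlist and download it.
            if let segmentURL = URL(string: segment, relativeTo: playlistURL) {
                print("new segment to download:", segmentURL)
            }
        }
    }.resume()
}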
Here are some related answers: Can IOS devices stream m3u8 segmented video from the local file system using html5 video and phonegap/cordova?
You can easily download an HLS stream with AVAssetDownloadURLSession's makeAssetDownloadTask. Have a look at the AssetPersistenceManager in Apple's sample code: https://developer.apple.com/library/content/samplecode/HLSCatalog/Introduction/Intro.html
It should be fairly straightforward to use the Objective-C version of the API.
Yes, you can download a video stream served over HLS and watch it later.
There is a very straightforward sample app (HLSCatalog) from Apple on this, and the code is fairly simple. You can find it here: https://developer.apple.com/services-account/download?path=/Developer_Tools/FairPlay_Streaming_Server_SDK_v3.1/FairPlay_Streaming_Server_SDK_v3.1.zip
You can find more about offline HLS streaming here.

How to save recorded audio on iOS?

I am developing an application in which audio is being recorded and being transcribed to text. I am using the Speechkit provided by Nuance Developers.
The functions I am adding are:
Save the recorded audio file to persistent memory
Display the audio files in a table view
Load the saved audio files later
Play the audio files
How do I save the audio files to persistent storage?
Here's the code: https://gist.github.com/buildFlash/48d143217b721823ff4c3c03a925ba55
When you record audio with AVAudioRecorder, you pass the URL of the location where the recording should be stored, so by default the audio is saved at that location.
For example:
import AVFoundation

let audioSession = AVAudioSession.sharedInstance()
do {
    try audioSession.setCategory(.playAndRecord, mode: .default)
    try audioSession.setActive(true)

    let documents = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0]
    let url = URL(fileURLWithPath: documents).appendingPathComponent("myRecording1.caf")

    let recordSettings: [String: Any] = [
        AVFormatIDKey: kAudioFormatAppleIMA4,
        AVSampleRateKey: 44100.0,
        AVNumberOfChannelsKey: 2,
        AVEncoderBitRateKey: 12800,
        AVLinearPCMBitDepthKey: 16,
        AVEncoderAudioQualityKey: AVAudioQuality.max.rawValue
    ]

    print("url: \(url)")
    // audioRecorder is assumed to be a property of the surrounding class.
    audioRecorder = try AVAudioRecorder(url: url, settings: recordSettings)
    audioRecorder.record()
} catch {
    print(error.localizedDescription)
}
So, here url is the location where your audio is stored, and you can use that same url to play the audio back. You can also read the file at that url (or path) as data if you want to send it to a server.
So, if you are using a third-party library, check where it stores the audio; you can fetch the file from there, or the library should have some method that returns its location.
PS: there is no need to use a third-party library to record audio, because you can easily manage it via AVAudioRecorder and AVAudioPlayer (for playing audio from a url).
In short, if you are recording audio, you are already storing it at the same time!
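A minimal sketch of playing the recording back, assuming url is the same file URL the recorder wrote to:
import AVFoundation

var audioPlayer: AVAudioPlayer?

func playRecording(at url: URL) {
    do {
        audioPlayer = try AVAudioPlayer(contentsOf: url)
        audioPlayer?.prepareToPlay()
        audioPlayer?.play()
    } catch {
        print(error.localizedDescription)
    }
}
// To upload the recording, the same url also gives you the bytes:
// let data = try Data(contentsOf: url)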
You can also refer to Ravi Shankar's tutorial.
Reference: this SO post

How to convert a video NSURL to an ALAsset?

Facebook sharing requires an ALAsset such as the following:
let content = FBSDKShareVideoContent()
//The videos must be less than 12MB in size.
let bundle = NSBundle.mainBundle()
let path = bundle.URLForResource("a", withExtension: "mp4")
let video = FBSDKShareVideo()
// doesn't work; needs to be an "asset url" (ALAsset)
//video.videoURL = path
content.video = video
let dialog = FBSDKShareDialog()
dialog.shareContent = content
dialog.show()
How is it possible to take a local bundle document, or an NSData object, and convert it to an ALAsset?
(My initial thought was to save the video to the local camera roll and then load the list and select it, but those are unnecessary interface steps.)
The documentation for an ALAsset states that
An ALAsset object represents a photo or a video managed by the Photo application.
so I'm pretty sure that you have to write the video to the camera roll before using it as an ALAsset. However, you don't need to open the camera roll and have the user pick the asset in order to use it. When writing to the ALAssetsLibrary using
library.writeVideoAtPathToSavedPhotosAlbum(movieURL, completionBlock: { (newURL, error) -> Void in ... })
you get the asset URL in that newURL completion-block variable. Use it in the Facebook sharing call:
let content = FBSDKShareVideoContent()
content.video = FBSDKShareVideo(videoURL: newURL)
FBSDKShareAPI.shareWithContent(content, delegate: self)
NSLog("Facebook content shared \(content.video.videoURL)")
You can do the sharing inside the completion block if you so desire, or you can save the newURL from the completion block and use it somewhere else.
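Assembled, the whole flow looks roughly like this (a sketch, assuming movieURL is the local video URL; ALAssetsLibrary is deprecated, but it is what the Facebook SDK of this era expects):
let library = ALAssetsLibrary()
library.writeVideoAtPathToSavedPhotosAlbum(movieURL, completionBlock: { (newURL, error) -> Void in
    guard error == nil else { return }
    // Share only once the asset URL exists in the camera roll.
    let content = FBSDKShareVideoContent()
    content.video = FBSDKShareVideo(videoURL: newURL)
    FBSDKShareAPI.shareWithContent(content, delegate: self)
})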

Swift/iOS: AVAudioPlayer does not play sound in NSData format

I am trying to create an AVAudioPlayer that plays NSData downloaded from Parse.
I am pretty certain the sound (.wav format) has been uploaded to Parse. I am also certain that the sound can be downloaded from Parse in the NSData format. So I am creating an AVAudioPlayer object using the downloaded data from Parse:
if audioData != nil {
    print("successful downloading audio!") // this prints out
    let audioPlayer = try! AVAudioPlayer(data: audioData!, fileTypeHint: AVFileTypeWAVE)
    audioPlayer.prepareToPlay()
    audioPlayer.volume = 0.5
    audioPlayer.play()
}
As you can see above, the audioPlayer is created, but it does not play the sound. What might be wrong?
Is this code within a function? If so, you're making the audioPlayer variable local. This means the audioPlayer is created, starts playing, and is then deallocated at the end of the function call, resulting in no audio. Your audioPlayer object needs to be a class property or live in global space (a singleton, for example), so that it persists after the function returns.
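A minimal sketch of holding the player as a property so it outlives the call (AVFileType.wav.rawValue is the modern spelling of AVFileTypeWAVE):
import AVFoundation

final class SoundPlayer {
    // Held as a property so the player isn't deallocated when the method returns.
    private var audioPlayer: AVAudioPlayer?

    func play(_ audioData: Data) {
        do {
            audioPlayer = try AVAudioPlayer(data: audioData, fileTypeHint: AVFileType.wav.rawValue)
            audioPlayer?.prepareToPlay()
            audioPlayer?.volume = 0.5
            audioPlayer?.play()
        } catch {
            print("could not create player: \(error)")
        }
    }
}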
