I'm trying to play an MP3 file via AVPlayer:
let url = URL(string: "http://transom.org/wp-content/uploads/2004/03/stereo_40kbps.mp3?_=7")!
let asset = AVURLAsset(url: url)
let item = AVPlayerItem(asset: asset)
let player = AVPlayer(playerItem: item)
player.play()
But I'm getting the following log:
2017-09-26 21:57:07.906598+0300 MyApp[7558:1177816] CredStore - performQuery - Error copying matching creds. Error=-25300, query={
class = inet;
"m_Limit" = "m_LimitAll";
"r_Attributes" = 1;
sync = syna;
}
I guess it's because of iOS 11 and Xcode 9, but I have no idea how to solve this problem.
The problem seems to be with App Transport Security. After allowing the connection, the following code worked fine for me on iOS 11.
Also, the URL you provided appears to have an associated https link, so either use the https link or add an App Transport Security exception.
let avPlayerVC = AVPlayerViewController()
let url = URL(string: "https://transom.org/wp-content/uploads/2004/03/stereo_40kbps.mp3?_=7")!
let asset = AVURLAsset(url: url)
let item = AVPlayerItem(asset: asset)
let player = AVPlayer(playerItem: item)
avPlayerVC.player = player
present(avPlayerVC, animated: true) {
    player.play()
}
Here I have used AVPlayerViewController with the same AVPlayer instance as in your code.
I don't know how you are presenting the AVPlayer in your case, but the above worked fine for me.
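If you need to keep the http URL instead, a minimal sketch of an App Transport Security exception in Info.plist (the domain below is only an example taken from the question's URL; a blanket NSAllowsArbitraryLoads key also works, but a per-domain exception is the narrower option):
<key>NSAppTransportSecurity</key>
<dict>
    <key>NSExceptionDomains</key>
    <dict>
        <key>transom.org</key>
        <dict>
            <!-- Allow plain-HTTP loads for this host only -->
            <key>NSExceptionAllowsInsecureHTTPLoads</key>
            <true/>
            <key>NSIncludesSubdomains</key>
            <true/>
        </dict>
    </dict>
</dict>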
Same error.
I solved this with server-side work: removing the HSTS header "Strict-Transport-Security: max-age=31536000" from the HTTP response made the error disappear and playback work fine. I don't know the root cause.
Request: https://abc.../aaaa.mp3
I faced the same issue. We tried toggling all the headers on the server side between inactive and active; in the end, setting "unset range" to inactive made it work.
Related
We have successfully configured Subtitles/Captions in Azure Media Player which plays media on the Web side.
But how do we configure the same for media managed by AMS, played in iOS with the native AVPlayer? We know that captions/subtitles can be played in the native iOS player with a sidecar WebVTT file, but is the "transcript.vtt" file generated by AMS that sidecar WebVTT file?
If not, how do we generate the sidecar WebVTT file?
We have implemented the code below, with the media file accessed from the AMS link and a locally downloaded transcript.vtt file, but it fails.
[EDITED : 20200413]
However, when we have a local media file and a local transcript.vtt file, or when we directly access the media file in the media storage account (https://mediastorageaccount.blob.core.windows.net/container/file.mp4), it works fine. But when we access the encoded file from the link generated by the AMS Transform (https://mediaservice-inct.streaming.media.azure.net/788888-6666-4444-aaaa-823422j218/file.ism/manifest(format=m3u8-cmaf)), it fails.
What is wrong here?
func playVideo() {
    let strUrl = "https://mediaservice-inct.streaming.media.azure.net/79510-6eb-340-a90-824218/German-FAST_Lesson-2-Dialog.ism/manifest(format=m3u8-cmaf)"
    localVideoAsset = AVURLAsset(url: URL(string: strUrl)!)

    // First add the video track to the AVMutableComposition
    let videoTrack = videoPlusSubtitles.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)
    do {
        guard localVideoAsset!.tracks.count > 0 else {
            // error msg
            return
        }
        try? videoTrack?.insertTimeRange(CMTimeRangeMake(start: CMTime.zero, duration: localVideoAsset!.duration),
                                         of: localVideoAsset!.tracks(withMediaType: .video)[0],
                                         at: seconds)
    }

    // Then add the subtitle track to the AVMutableComposition
    if isEnglishSubtitle {
        setSubtitleTrack(subtitle: "transcript")
    } else {
        setSubtitleTrack(subtitle: "transcript_tr")
    }

    // Once both tracks are set, hand the composition to the player
    player = AVPlayer(playerItem: AVPlayerItem(asset: videoPlusSubtitles))
    playerLayer.removeFromSuperlayer()
    playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = self.videoView.bounds
    playerLayer.videoGravity = .resizeAspect
    self.videoView.layer.addSublayer(playerLayer)
    player.play()
}
func setSubtitleTrack(subtitle: String) {
    print(subtitle)
    print(seconds)

    // If a previous subtitle track exists, remove it first
    if subtitleTrack != nil {
        videoPlusSubtitles.removeTrack(subtitleTrack!)
    }

    // Load the subtitle file from the bundle
    let subtitleAsset = AVURLAsset(url: Bundle.main.url(forResource: subtitle, withExtension: ".vtt")!)

    // Add the new text track to the composition
    subtitleTrack = videoPlusSubtitles.addMutableTrack(withMediaType: .text, preferredTrackID: kCMPersistentTrackID_Invalid)
    do {
        guard subtitleAsset.tracks.count > 0 else {
            // error msg
            return
        }
        try? subtitleTrack?.insertTimeRange(CMTimeRangeMake(start: CMTime.zero, duration: localVideoAsset!.duration),
                                            of: subtitleAsset.tracks(withMediaType: .text)[0],
                                            at: seconds)
    }
}
I suspect the issue is not caused by the AMS stream. To double-check, you may want to try another HLS stream (e.g., one of the HLS examples provided by Apple). Apple has specific requirements for playing VTT in AVPlayer. I've included an Apple doc link with many streaming examples, plus other links that may be helpful:
https://developer.apple.com/streaming/examples/
How to add external .vtt subtitle file to AVPlayerViewController in tvOS
AVUrlAsset and WebVTTs
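As an aside, when the subtitles are delivered inside the HLS stream itself (rather than as a local sidecar file), the usual approach is to pick them through the asset's legible media selection group instead of stitching an AVMutableComposition. A minimal sketch, assuming an AVPlayerItem whose stream already advertises subtitle renditions:
import AVFoundation

// Sketch: select an HLS subtitle rendition via media selection.
// `playerItem` is assumed to be backed by an HLS asset whose master
// playlist lists subtitle renditions.
func selectSubtitles(on playerItem: AVPlayerItem, languageCode: String) {
    let asset = playerItem.asset
    guard let group = asset.mediaSelectionGroup(forMediaCharacteristic: .legible) else {
        return // the stream exposes no subtitle renditions
    }
    // Filter the group's options by locale and pick the first match, if any.
    let options = AVMediaSelectionGroup.mediaSelectionOptions(
        from: group.options,
        with: Locale(identifier: languageCode))
    playerItem.select(options.first, in: group) // passing nil deselects subtitles
}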
I would like to play a video file in my ViewController, which is loaded on every page of my PageViewController. As you can see, I use a plugin called Carlos to cache the videos (which initially need to be downloaded from a server) so that they do not have to be downloaded every time the user hits a new page. However, I can't figure out how to play this downloaded file (NSData). So I would really like to know how I can get the URL of the downloaded file so that I can play it using AVPlayer.
Code (still using URL from server)
let omniCache = videoCache.cache
let request = omniCache.get(URL(string: video!)!)
request
    .onSuccess { videoFile in
        print("The file..." )
        print(videoFile)
        // How can I get the local URL here instead of my server url?
        if let videoURL = URL(string: self.video!) {
            if self.player == nil {
                let playerItemToBePlayed = AVPlayerItem(url: videoURL as URL)
                self.player = AVPlayer(playerItem: playerItemToBePlayed)
                let playerLayer = AVPlayerLayer(player: self.player)
                playerLayer.frame = self.view.frame
                self.controlsContainerView.layer.insertSublayer(playerLayer, at: 0)
            }
        }
    }
    .onFailure { error in
        print("An error occurred :( \(error)")
    }
Look at this code of yours:
videoFile in
print("The file..." )
print(videoFile)
if let videoURL = URL(string: self.video!){
So in the first line you print videoFile, which turns out to be the data of the file. But then you ignore it! You never mention videoFile again. Why do you ignore it? That is the data, you already have the data. Now play it!
If the data is a file, get its file URL and play it. If it is in memory — it definitely should not be, because a video held entirely in memory would crash your program — save it, and get that file URL and play it.
[I have to ask, however, why you are interposing this cache plug-in between yourself and such a simple task. Why don't you just download the remote video to disk, yourself?]
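For illustration, a minimal sketch of that last suggestion, assuming videoFile arrives as Data (or NSData bridged to Data) and that you present the player from a view controller; the file name is arbitrary:
import UIKit
import AVFoundation
import AVKit

// Sketch: persist the cached bytes to a temporary file, then play that file URL.
func play(videoFile: Data, from presenter: UIViewController) {
    let fileURL = FileManager.default.temporaryDirectory
        .appendingPathComponent("cachedVideo.mp4") // the extension matters to AVPlayer
    do {
        try videoFile.write(to: fileURL, options: .atomic)
    } catch {
        print("Could not write video to disk: \(error)")
        return
    }
    let playerVC = AVPlayerViewController()
    playerVC.player = AVPlayer(url: fileURL)
    presenter.present(playerVC, animated: true) {
        playerVC.player?.play()
    }
}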
Since iOS 10, Apple has provided support for downloading HLS (m3u8) video for offline viewing.
My question is: is it necessary that HLS can only be downloaded while it is being played, or can we start the download when the user presses a download button and show progress?
Has anyone implemented this in Objective-C? My previous app is written in Objective-C, and now I want to add support for downloading HLS rather than MP4 (previously I was downloading MP4 for offline viewing).
I am really stuck on this. Please share thoughts or any code if you have implemented it.
I used the Apple code guide to download HLS content with the following code:
var configuration: URLSessionConfiguration?
var downloadSession: AVAssetDownloadURLSession?
var downloadIdentifier = "\(Bundle.main.bundleIdentifier!).background"

func setupAssetDownload(videoUrl: String) {
    // Create a new background session configuration.
    configuration = URLSessionConfiguration.background(withIdentifier: downloadIdentifier)
    // Create a new AVAssetDownloadURLSession with background configuration, delegate, and queue
    downloadSession = AVAssetDownloadURLSession(configuration: configuration!,
                                                assetDownloadDelegate: self,
                                                delegateQueue: OperationQueue.main)
    if let url = URL(string: videoUrl) {
        let asset = AVURLAsset(url: url)
        // Create a new AVAssetDownloadTask for the desired asset
        let downloadTask = downloadSession?.makeAssetDownloadTask(asset: asset,
                                                                  assetTitle: "Some Title",
                                                                  assetArtworkData: nil,
                                                                  options: nil)
        // Start the task and begin the download
        downloadTask?.resume()
    }
} // end method

func urlSession(_ session: URLSession, assetDownloadTask: AVAssetDownloadTask, didFinishDownloadingTo location: URL) {
    // Do not move the asset from the download location
    UserDefaults.standard.set(location.relativePath, forKey: "testVideoPath")
}
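Since the question also asks about showing progress, here is a sketch of the progress callback from AVAssetDownloadDelegate, assuming it lives in the same delegate class as the code above; the percentage calculation follows the pattern from Apple's documentation, and how you surface it in the UI is up to you:
func urlSession(_ session: URLSession,
                assetDownloadTask: AVAssetDownloadTask,
                didLoad timeRange: CMTimeRange,
                totalTimeRangesLoaded loadedTimeRanges: [NSValue],
                timeRangeExpectedToLoad: CMTimeRange) {
    // Sum the durations of the loaded ranges and compare to the expected duration.
    var percentComplete = 0.0
    for value in loadedTimeRanges {
        let loadedTimeRange = value.timeRangeValue
        percentComplete += loadedTimeRange.duration.seconds / timeRangeExpectedToLoad.duration.seconds
    }
    // e.g. update a progress bar here
    print("Download progress: \(Int(percentComplete * 100))%")
}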
If you don't understand what's going on, read up on it here:
https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/MediaPlaybackGuide/Contents/Resources/en.lproj/HTTPLiveStreaming/HTTPLiveStreaming.html
Now you can use the stored HLS content to play the video in AVPlayer with the following code:
//get the saved link from the user defaults
let savedLink = UserDefaults.standard.string(forKey: "testVideoPath")
let baseUrl = URL(fileURLWithPath: NSHomeDirectory()) //app's home directory
let assetUrl = baseUrl.appendingPathComponent(savedLink!) //append the saved link to home path
Now use the path to play the video in AVPlayer:
let avAsset = AVAsset(url: assetUrl)
let playerItem = AVPlayerItem(asset: avAsset)
let player = AVPlayer(playerItem: playerItem) // video path coming from the code above
let playerViewController = AVPlayerViewController()
playerViewController.player = player
self.present(playerViewController, animated: true, completion: {
    player.play()
})
The only way you can do this is to set up an HTTP server to serve the files locally after you've downloaded them.
A live playlist uses a sliding window: you need to periodically reload it after the target duration and download only the new segments as they appear in the list (older segments are removed from the playlist over time); a rough sketch follows after the link below.
Here are some related answers: Can IOS devices stream m3u8 segmented video from the local file system using html5 video and phonegap/cordova?
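To make the sliding-window idea concrete, here is a rough sketch of polling a live media playlist and collecting segments that have not been seen before. The class name, polling interval, and the naive line-based parsing are all assumptions for illustration; a real client should respect #EXT-X-TARGETDURATION and media sequence numbers:
import Foundation

// Sketch: poll a live m3u8 playlist and report newly appeared segment URIs.
final class LivePlaylistPoller {
    private let playlistURL: URL
    private var seenSegments = Set<String>()
    private var timer: Timer?

    init(playlistURL: URL) {
        self.playlistURL = playlistURL
    }

    func start(interval: TimeInterval) {
        timer = Timer.scheduledTimer(withTimeInterval: interval, repeats: true) { [weak self] _ in
            self?.reloadPlaylist()
        }
    }

    private func reloadPlaylist() {
        URLSession.shared.dataTask(with: playlistURL) { [weak self] data, _, _ in
            guard let self = self, let data = data,
                  let playlist = String(data: data, encoding: .utf8) else { return }
            // Naive parse: every non-comment, non-empty line is a segment URI.
            let segments = playlist.split(separator: "\n")
                .map(String.init)
                .filter { !$0.isEmpty && !$0.hasPrefix("#") }
            for segment in segments where !self.seenSegments.contains(segment) {
                self.seenSegments.insert(segment)
                print("New segment to download: \(segment)")
                // download the segment and rewrite the local playlist here
            }
        }.resume()
    }
}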
You can easily download an HLS stream with AVAssetDownloadURLSession's makeAssetDownloadTask. Have a look at the AssetPersistenceManager in Apple's sample code: https://developer.apple.com/library/content/samplecode/HLSCatalog/Introduction/Intro.html
It should be fairly straightforward to use the Objective-C version of the API.
Yes, you can download a video stream served over HLS and watch it later.
There is a very straightforward sample app (HLSCatalog) from Apple on this. The code is fairly simple. You can find it here: https://developer.apple.com/services-account/download?path=/Developer_Tools/FairPlay_Streaming_Server_SDK_v3.1/FairPlay_Streaming_Server_SDK_v3.1.zip
You can find more about offline HLS streaming here.
I am trying to use an AVPlayer to play a video that has been recorded in my app. However, the player won't play the video. I know for a fact that this is a properly recorded mp4 file, because I can take it and play it on my Mac just fine. Here's the setup for the player:
let documents = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask).first!
let URL = NSURL(fileURLWithPath: "tempVideo", relativeToURL: documents)
let asset = AVAsset(URL: URL)
let item = AVPlayerItem(asset: asset)
//videoPlayer is a property on the view controller being used
videoPlayer = AVPlayer(playerItem: item)
//videoPlayerLayer is a property on the view controller being used
videoPlayerLayer = AVPlayerLayer(player: videoPlayer)
videoPlayerLayer.frame.size = view.frame.size
videoPlayerLayer.backgroundColor = UIColor.redColor().CGColor
view.layer.addSublayer(videoPlayerLayer!)
//wait 5 seconds
videoPlayer.play()
I know for sure that the videoPlayer is, in fact, ready to play, because I've checked its status property. I also know that videoPlayerLayer has properly been added to view.layer because it's visible and takes up the whole screen. When I call videoPlayer.play(), the music playing on the device stops, but videoPlayerLayer doesn't show anything.
Any ideas? Thank you in advance for the help!
EDIT: I forgot to show that videoPlayerLayer is indeed connected to videoPlayer, I have updated my question to reflect this.
The correct answer was given by @Dershowitz123, but they left it in a comment so I can't mark it as correct. The solution was to change the URL to include the .mp4 extension. Thank you for your help.
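For completeness, a minimal sketch of the fix in current Swift, assuming the recording was saved to the Documents directory as "tempVideo.mp4" (the file name comes from the question; the extension is the actual fix):
import AVFoundation

// Build the file URL with the .mp4 extension so AVPlayer can recognize the container.
let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
let videoURL = documents.appendingPathComponent("tempVideo.mp4")
let videoPlayer = AVPlayer(url: videoURL)
videoPlayer.play()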
Is it possible to send a cookie with the AVPlayer URL? I have a livestream which is AES encrypted and needs a key to decrypt. The request hits the server, and the server returns the key only if a session is present. So I want to send the PHP session id along with the URL to AVPlayer.
Is it possible? I saw AVURLAssetHTTPHeaderFieldsKey. I don't know if that is what I have to set. If so, how do I do it?
This is how you can set signed cookies (headers) in an AVPlayer URL request:
fileprivate func setPlayRemoteUrl() {
    if playUrl.isEmpty { return }
    guard let videoURL = URL(string: playUrl) else { return }

    let cookiesArray = HTTPCookieStorage.shared.cookies!
    let values = HTTPCookie.requestHeaderFields(with: cookiesArray)
    let cookieArrayOptions = ["AVURLAssetHTTPHeaderFieldsKey": values]
    let assets = AVURLAsset(url: videoURL, options: cookieArrayOptions)
    let item = AVPlayerItem(asset: assets)
    player = AVPlayer(playerItem: item)
    playerLayer = AVPlayerLayer(player: player)
    playerLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
    playerLayer?.contentsScale = UIScreen.main.scale
    layer.insertSublayer(playerLayer!, at: 0)
}
In your case, FPS (FairPlay Streaming) by Apple will work. FairPlay Streaming is Apple's DRM (Digital Rights Management) support, where you receive the content key along with your content data and hand it over through a delegate that supports AES-128 encryption. Please refer to the link below:
https://developer.apple.com/streaming/fps/
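Whether you end up with FairPlay or stay with plain AES-128 keys fetched from your own server, the key request is typically intercepted through AVAssetResourceLoaderDelegate. A rough sketch, assuming the key URI in the playlist uses a custom scheme (here "ckey://") so AVFoundation routes it to the delegate; the fetchKey helper is a placeholder for your real, session-aware request:
import AVFoundation

final class KeyLoader: NSObject, AVAssetResourceLoaderDelegate {

    // AVFoundation calls this for URLs it cannot load itself,
    // e.g. a key URI using the custom "ckey://" scheme assumed here.
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        guard let url = loadingRequest.request.url, url.scheme == "ckey" else {
            return false // not a key request we handle
        }
        fetchKey(for: url) { result in
            switch result {
            case .success(let keyData):
                loadingRequest.dataRequest?.respond(with: keyData)
                loadingRequest.finishLoading()
            case .failure(let error):
                loadingRequest.finishLoading(with: error)
            }
        }
        return true
    }

    // Hypothetical helper; replace with a request that carries your session cookie.
    private func fetchKey(for url: URL, completion: @escaping (Result<Data, Error>) -> Void) {
        completion(.failure(NSError(domain: "KeyLoader", code: -1)))
    }
}

// Usage: register the delegate before creating the player item.
// let asset = AVURLAsset(url: streamURL)
// let loader = KeyLoader() // keep a strong reference somewhere
// asset.resourceLoader.setDelegate(loader, queue: DispatchQueue(label: "key.loader.queue"))
// let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))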
I haven't really tried it myself, but it seems there's an API that lets you create an AVURLAsset with options. One of the possible option keys is AVURLAssetHTTPCookiesKey. You might want to look into that.
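A minimal sketch of that option; the cookie name, value, domain, and URL below are placeholders standing in for the question's PHP session:
import AVFoundation

// Attach an explicit cookie to the requests AVPlayer makes for this asset.
let streamURL = URL(string: "https://example.com/live/stream.m3u8")!
let sessionCookie = HTTPCookie(properties: [
    .name: "PHPSESSID",
    .value: "your-session-id",
    .domain: "example.com",
    .path: "/"
])!
let asset = AVURLAsset(url: streamURL,
                       options: [AVURLAssetHTTPCookiesKey: [sessionCookie]])
let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))
player.play()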