Video streaming is not working against a partial-content (byte-range) streaming web API.
// Attach a Range header to the asset's underlying HTTP requests via the header-fields option key.
let headerFields: [String: String] = ["Range": "bytes"]
let avAsset = AVURLAsset(url: url, options: ["AVURLAssetHTTPHeaderFieldsKey": headerFields])
let playerItem = AVPlayerItem(asset: avAsset)
self.player = AVPlayer(playerItem: playerItem)
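For context, AVPlayer's progressive (non-HLS) playback relies on the server answering byte-range requests with 206 Partial Content. A minimal sketch for probing that, assuming the same url as above (the probe itself is not part of the original code):

import Foundation

// Send a one-byte Range request and inspect the response; a 200 instead of 206
// usually means the endpoint does not really serve partial content.
var probe = URLRequest(url: url)
probe.setValue("bytes=0-1", forHTTPHeaderField: "Range")
URLSession.shared.dataTask(with: probe) { _, response, _ in
    guard let http = response as? HTTPURLResponse else { return }
    print("status:", http.statusCode)
    print("Accept-Ranges:", http.allHeaderFields["Accept-Ranges"] ?? "none")
}.resume()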
I tried to play a remote MPGA-format audio file with AVPlayer on iOS, but it doesn't play. The code below works fine for remote MP3 files. Can I change the MIME type for the loaded file, or are there any other options for playing an MPGA file on iOS?
let audioUrl = "https://example-files.online-convert.com/audio/mpga/example.mpga"
let url = URL(string: audioUrl)
let player: AVPlayer = AVPlayer(url: url!)
player.play()
The console shows:
nw_endpoint_flow_copy_multipath_subflow_counts Called on non-Multipath connection
let audioUrl = "https://example-files.online-convert.com/audio/mpga/example.mpga"
if let url = URL(string: audioUrl) {
    let playerItem = AVPlayerItem(url: url)
    let player = AVPlayer(playerItem: playerItem)
    player.play()
}
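This doesn't change the MIME type, but one way to see why the item refuses to play is to observe the item's status and print its error when it fails. A minimal sketch using key-value observation, reusing audioUrl from above:

import AVFoundation

var statusObservation: NSKeyValueObservation?
var player: AVPlayer?

if let url = URL(string: audioUrl) {
    let playerItem = AVPlayerItem(url: url)
    let avPlayer = AVPlayer(playerItem: playerItem)
    player = avPlayer  // keep a strong reference so playback isn't torn down

    // Start playback only once the item is ready; surface the underlying error on failure.
    statusObservation = playerItem.observe(\.status, options: [.new]) { item, _ in
        switch item.status {
        case .readyToPlay:
            avPlayer.play()
        case .failed:
            print("item failed:", item.error?.localizedDescription ?? "unknown error")
        default:
            break
        }
    }
}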
I want to play streaming audio on an iOS app. I use this code to play audio:
let url = URL(string: "http://online.radiorecord.ru:8102/chil_320.m3u")!
let asset = AVAsset(url: url)
let playerItem = AVPlayerItem(asset: asset)
let player = AVPlayer(playerItem: playerItem)
player.play()
When I run the app on the Simulator I get this error:
2018-11-30 21:57:26.097577+0300 radio[7417:21800733]
[AudioHAL_Client]
AudioHardware.cpp:1210:AudioObjectRemovePropertyListener:
AudioObjectRemovePropertyListener: no object with given ID 0
How do I fix this issue?
var player: AVPlayer?
let url = URL(string: "http://online.radiorecord.ru:8102/chil_320.m3u")!
player = AVPlayer(url: url)
player?.play()
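One thing worth checking (an assumption, not necessarily related to the log line above): on a real device the shared audio session usually has to be configured for playback before AVPlayer produces audible output. A minimal sketch:

import AVFoundation

// Configure and activate the shared audio session for playback.
do {
    try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [])
    try AVAudioSession.sharedInstance().setActive(true)
} catch {
    print("audio session error:", error)
}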
I've downloaded a file, and now I'm trying to play it back:
// Build the local file URL for the downloaded audio file.
let paths = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0] as NSString
let filePath = (paths as String) + "/" + (df.destinationURL?.lastPathComponent)!
print(filePath)
let fileUrl = URL(fileURLWithPath: filePath)

// Wrap the file in a WatchKit audio asset and play it.
let asset = WKAudioFileAsset(url: fileUrl)
self.downloadProgress.setText(String(asset.duration))
let playerItem = WKAudioFilePlayerItem(asset: asset)
self.player = WKAudioFilePlayer(playerItem: playerItem)
print(self.player.description)
self.player.play()
No sound is played. However, any currently playing music IS paused, and asset.duration DOES return the correct duration for the file; despite this, no audio comes out at all.
Using self.presentMediaController() DOES work (with the same file URL), but playback pauses when I leave the app.
Any ideas?
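For reference, a minimal sketch of the media-player-controller route mentioned above, assuming fileUrl is the same downloaded file URL and that this runs inside a WKInterfaceController:

import WatchKit

// Present the built-in media player for the local file and start playback automatically.
let options: [String: Any] = [WKMediaPlayerControllerOptionsAutoplayKey: true]
presentMediaPlayerController(with: fileUrl, options: options) { didPlayToEnd, endTime, error in
    if let error = error {
        print("playback error:", error)
    }
}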
I recorded a 240 fps video after changing the AVCaptureDeviceFormat. If I save that video to the photo library, the slow-mo effect is there. But if I play that file from the documents directory using an AVPlayer, I can't see the slow-mo effect.
Code to play the video:
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:[AVAsset assetWithURL:[NSURL fileURLWithPath:fullPath]]];
AVPlayer *feedVideoPlayer = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerViewController *playerController = [[AVPlayerViewController alloc] init];
playerController.view.frame = CGRectMake(0, 0, videoPreviewView.frame.size.width, videoPreviewView.frame.size.height);
playerController.player = feedVideoPlayer;
It's a bit annoying, but I believe you'll need to re-create the video in an AVComposition if you don't want to lose quality. I'd love to know if there is another way, but this is what I've come up with. You can technically export the video via AVAssetExportSession, but using a passthrough preset will result in the same video file, which won't be slow motion; you'll need to transcode it, which loses quality (AFAIK; see "Issue playing slow-mo AVAsset in AVPlayer" for that solution).
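For completeness, the export-session route can look roughly like this (a sketch only: phAsset is the same photo-library asset used below, outputURL is an assumed writable file URL, and any non-passthrough preset transcodes, trading quality for a file with the retiming baked in):

import Photos
import AVFoundation

let exportOptions = PHVideoRequestOptions()
exportOptions.version = .current

// Ask Photos for an export session configured with a transcoding preset.
PHImageManager().requestExportSession(forVideo: phAsset,
                                      options: exportOptions,
                                      exportPreset: AVAssetExportPresetHighestQuality) { exportSession, _ in
    guard let exportSession = exportSession else { return }
    exportSession.outputURL = outputURL
    exportSession.outputFileType = AVFileTypeQuickTimeMovie
    exportSession.exportAsynchronously {
        print("export finished with status:", exportSession.status.rawValue)
    }
}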
The first thing you'll need to do is grab the source media's original time mapping objects. You can do that like so:
let options = PHVideoRequestOptions()
options.version = PHVideoRequestOptionsVersion.current
options.deliveryMode = .highQualityFormat

PHImageManager().requestAVAsset(forVideo: phAsset, options: options, resultHandler: { (avAsset, mix, info) in
    guard let avAsset = avAsset else { return }
    // Collect the CMTimeMapping for every segment of the original video track.
    let originalTimeMaps = avAsset.tracks(withMediaType: AVMediaTypeVideo)
        .first?
        .segments
        .map { $0.timeMapping } ?? []
})
Once you have timeMappings of the original media (the one sitting in your documents directory), you can pass in the URL of that media and the original CMTimeMapping objects that you would like to recreate. Then create a new AVComposition that is ready to play in an AVPlayer. You'll need a class similar to this:
class CompositionMapper {

    let url: URL
    let timeMappings: [CMTimeMapping]

    init(for url: URL, with timeMappings: [CMTimeMapping]) {
        self.url = url
        self.timeMappings = timeMappings
    }

    init(with asset: AVAsset, and timeMappings: [CMTimeMapping]) {
        guard let asset = asset as? AVURLAsset else {
            print("cannot get a base URL from this asset.")
            fatalError()
        }
        self.timeMappings = timeMappings
        self.url = asset.url
    }

    func compose() -> AVComposition {
        let composition = AVMutableComposition(urlAssetInitializationOptions: [AVURLAssetPreferPreciseDurationAndTimingKey: true])

        let videoTrack = composition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
        let audioTrack = composition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid)

        let asset = AVAsset(url: url)
        guard let videoAssetTrack = asset.tracks(withMediaType: AVMediaTypeVideo).first else { return composition }

        // Rebuild one composition segment per CMTimeMapping so the source/target
        // time ranges (and therefore the slow-motion retiming) are preserved.
        var segments: [AVCompositionTrackSegment] = []
        for map in timeMappings {
            let segment = AVCompositionTrackSegment(url: url, trackID: kCMPersistentTrackID_Invalid, sourceTimeRange: map.source, targetTimeRange: map.target)
            segments.append(segment)
        }

        videoTrack.preferredTransform = videoAssetTrack.preferredTransform
        videoTrack.segments = segments

        // Mirror the same segment layout onto the audio track when the source has audio.
        if let _ = asset.tracks(withMediaType: AVMediaTypeAudio).first {
            audioTrack.segments = segments
        }

        return composition.copy() as! AVComposition
    }
}
You can then use the compose() function of your CompositionMapper class to give you an AVComposition that is ready to play in an AVPlayer, which should respect the CMTimeMapping objects that you've passed in.
let compositionMapper = CompositionMapper(for: someAVAssetURL, with: originalTimeMaps)
let mappedComposition = compositionMapper.compose()

let playerItem = AVPlayerItem(asset: mappedComposition)
let player = AVPlayer(playerItem: playerItem)
// Varispeed lets the audio follow the retimed segments without pitch correction.
playerItem.audioTimePitchAlgorithm = AVAudioTimePitchAlgorithmVarispeed
Let me know if you need help converting this to Objective-C, but it should be relatively straightforward.