Fetch Video Details From HTTP Live Streaming using AVPlayerViewController - iOS

I am currently working on HTTP Live Streaming video with AVPlayerViewController / AVPlayer.
I am playing a video from an .m3u8 file.
It plays fine, but my question is: can I get the details of the video? I set up 4 resolutions while generating the .m3u8 file, so I can vary the resolution at my endpoint. The question is how to get all the values I set up at my endpoint. I am playing this on Android as well, and using Track I am able to fetch all the video information, but on iOS how can I fetch all the details of the video, like its height, width, tracks, resolutions supported by the video, etc.?
I have searched a lot but could not succeed.
Need help.
Thanks in advance.

Anita, to start you off, here is a slice of code for playing a VIDEO...
self.command.text = "Loading VIDEO"
let videoURL = self.currentSlide.aURL
self.playerItem = AVPlayerItem(URL: videoURL)
self.player = AVPlayer(playerItem: self.playerItem)
self.playerLayer = AVPlayerLayer(player: self.player)
self.streamPlayer = AVPlayerViewController()
self.streamPlayer.player = self.player
self.streamPlayer.view.frame = CGRect(x: 128, y: 222, width: 512, height: 256)
let currentFM = self.streamPlayer.player?.currentItem?.asset
for blah in (currentFM?.metadata.enumerate())! {
    print("blah \(blah)")
}
self.presentViewController(self.streamPlayer, animated: true) {
    self.streamPlayer.player!.play()
}
I added a little extra to show metadata about the video... it printed...
blah (0, <AVMetadataItem: 0x15e7e4280, identifier=itsk/gsst, keySpace=itsk, key class = __NSCFNumber, key=gsst, commonKey=(null), extendedLanguageTag=(null), dataType=com.apple.metadata.datatype.UTF-8, time={INVALID}, duration={INVALID}, startDate=(null), extras={
dataType = 1;
dataTypeNamespace = "com.apple.itunes";
}, value=0>)
blah (1, <AVMetadataItem: 0x15e7e1a50, identifier=itsk/gstd, keySpace=itsk, key class = __NSCFNumber, key=gstd, commonKey=(null), extendedLanguageTag=(null), dataType=com.apple.metadata.datatype.UTF-8, time={INVALID}, duration={INVALID}, startDate=(null), extras={
dataType = 1;
dataTypeNamespace = "com.apple.itunes";
}, value=222980>)
Hopefully it means more to you than it does to me :) What you're looking for, I think, is called metadata in Apple-speak...
Take a look at this post too
Capture still image from AVCaptureSession in Swift
It describes how to capture a frame of your video; once you have a frame you can take a closer look at its metadata and, I suspect, find some of the details you seek.
let metadata = info[UIImagePickerControllerMediaMetadata] as? NSDictionary
let image = info[UIImagePickerControllerOriginalImage] as? UIImage
These are the commands to try to fetch that... let me know if you manage to succeed!
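Since the question is really about listing the stream's resolutions, note that the metadata route above may come up empty for HLS. A hedged alternative sketch (names and approach are mine, not from this answer): watch the player item's presentationSize and read its access log, which records the bitrate of each variant actually played. To list every variant you configured at the endpoint, the usual fallback is to download the master .m3u8 and read the RESOLUTION attribute on each EXT-X-STREAM-INF line.
import AVFoundation

// Sketch: "player" is assumed to be the AVPlayer already playing the .m3u8.
func logCurrentVariantInfo(player: AVPlayer) {
    guard let item = player.currentItem else { return }
    // Resolution of the variant currently being rendered.
    print("presentation size: \(item.presentationSize)")
    // One access-log event is recorded per variant switch, including the
    // bitrate the master playlist advertised for that variant.
    if let events = item.accessLog()?.events {
        for event in events {
            print("indicated bitrate: \(event.indicatedBitrate)")
        }
    }
}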

Related

Swift: Video and image player (pager)

I work on an iOS app that brings videos and images from a server; each object has many images and videos, and I need to show them in a slider (pager) where the user can swipe to get the next one. I also need these videos and images cached on the device. I googled that and found this pod, but it does not play a video that is only a few seconds long, it just loads infinitely; I tried to play the video URL provided in the pod's example and it has the same issue. Is there an alternative solution, or any fix for this pod?
First of all, create a UIView outlet to display the image or video; then you can use this code:
let videolink = (videosString[indexPath.row] as? String)!
let videoURL = NSURL(string: videolink)
let avPlayer = AVPlayer(url: videoURL! as URL)
let playerLayer = AVPlayerLayer(player: avPlayer)
// The layer is added to videoview's own layer, so frame it in
// videoview's coordinate space rather than the cell's.
playerLayer.frame = cell.videoview.bounds
cell.videoview.layer.addSublayer(playerLayer)
avPlayer.play()
Here videoview is the view where you wish to play the video (AVPlayer will add a layer to that view).
To present an image, you can use SKPhotoBrowser.
To differentiate image and video you can use a Bool flag.
Hope you get it.
If you need any help, ask me; I can code it for you.
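For the caching half of the question, here is a minimal hedged sketch (the function and names are mine, not from this answer): download the file once with URLSession and play the local copy on subsequent visits.
import AVFoundation

// Sketch: returns a local URL for the video, downloading it on first use.
func cachedURL(for remoteURL: URL, completion: @escaping (URL) -> Void) {
    let caches = FileManager.default.urls(for: .cachesDirectory, in: .userDomainMask)[0]
    let localURL = caches.appendingPathComponent(remoteURL.lastPathComponent)
    if FileManager.default.fileExists(atPath: localURL.path) {
        completion(localURL) // already cached
        return
    }
    URLSession.shared.downloadTask(with: remoteURL) { tempURL, _, error in
        guard let tempURL = tempURL, error == nil else { return }
        try? FileManager.default.moveItem(at: tempURL, to: localURL)
        DispatchQueue.main.async { completion(localURL) }
    }.resume()
}

You would then build the player inside the completion, e.g. cachedURL(for: videoURL) { avPlayer = AVPlayer(url: $0) ... }.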

Compressing an AVAsset (mainly AVMutableComposition)

I have a video with these specs
Format : H.264 , 1280x544
FPS : 25
Data Size : 26MB
Duration : 3:00
Data Rate : 1.17 Mbit/s
While experimenting, I performed a removeTimeRange(range: CMTimeRange) on every other frame (total frames = 4225). This results in the video becoming 2x faster, so the duration becomes 1:30.
However, when I export the video, it becomes 12x larger in size, i.e. 325MB. This makes sense, since this technique decomposes the video into about 2112 pieces and stitches them back together. Apparently, in doing so, the compression among individual frames is lost, causing the enormous size.
This causes stuttering when the video is played with an AVPlayer, and therefore poor performance.
Question: How can I apply some kind of compression while stitching the frames back together, so that the video plays smoothly and is also smaller in size?
I only want a pointer in the right direction. Thanks!
CODE
1) Creating an AVMutableComposition from Asset & Configuring it
func configureAssets(){
    let options = [AVURLAssetPreferPreciseDurationAndTimingKey : "true"]
    let videoAsset = AVURLAsset(url: Bundle.main.url(forResource: "Push", withExtension: "mp4")!, options: options)
    let videoAssetSourceTrack = videoAsset.tracks(withMediaType: AVMediaTypeVideo).first! as AVAssetTrack
    let comp = AVMutableComposition()
    let videoCompositionTrack = comp.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
    do {
        try videoCompositionTrack.insertTimeRange(
            CMTimeRangeMake(kCMTimeZero, videoAsset.duration),
            of: videoAssetSourceTrack,
            at: kCMTimeZero)
        deleteSomeFrames(from: comp)
        videoCompositionTrack.preferredTransform = videoAssetSourceTrack.preferredTransform
    } catch { print(error) }
    asset = comp
}
2) Deleting every other frame.
func deleteSomeFrames(from asset: AVMutableComposition){
    let fps = Int32(asset.tracks(withMediaType: AVMediaTypeVideo).first!.nominalFrameRate)
    let sumTime = Int32(asset.duration.value) / asset.duration.timescale
    let totalFrames = sumTime * fps
    let totalTime = Float(CMTimeGetSeconds(asset.duration))
    let frameDuration = Double(totalTime / Float(totalFrames))
    let frameTime = CMTime(seconds: frameDuration, preferredTimescale: 1000)
    for frame in Swift.stride(from: 0, to: totalFrames, by: 2){
        let timing = CMTimeMultiplyByFloat64(frameTime, Float64(frame))
        print("Asset Duration = \(CMTimeGetSeconds(asset.duration))")
        print("")
        let timeRange = CMTimeRange(start: timing, duration: frameTime)
        asset.removeTimeRange(timeRange)
    }
    print("duration after time removed = \(CMTimeGetSeconds(asset.duration))")
}
3) Saving the file
func createFileFromAsset(_ asset: AVAsset){
    let documentsDirectory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0] as URL
    let filePath = documentsDirectory.appendingPathComponent("rendered-vid.mp4")
    if let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetHighestQuality){
        exportSession.outputURL = filePath
        exportSession.shouldOptimizeForNetworkUse = true
        exportSession.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration)
        exportSession.outputFileType = AVFileTypeQuickTimeMovie
        exportSession.exportAsynchronously {
            print("finished: \(filePath) : \(exportSession.status.rawValue)")
            if exportSession.status == .failed {
                print("Export failed -> Reason: \(exportSession.error!.localizedDescription)")
                print(exportSession.error!)
            }
        }
    }
}
4) Finally update the ViewController to play the new Composition!
override func viewDidLoad() {
    super.viewDidLoad()
    // Create the AVPlayer and play the composition
    assetConfig.configureAssets()
    let snapshot: AVComposition = assetConfig.asset as! AVComposition
    let playerItem = AVPlayerItem(asset: snapshot)
    player = AVPlayer(playerItem: playerItem)
    let playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = CGRect(x: 0, y: 0, width: self.view.frame.width, height: self.view.frame.height)
    self.view.layer.addSublayer(playerLayer)
    player?.play()
}
If you are using AVMutableComposition, you will notice that each composition may contain one or more AVCompositionTracks (or AVMutableCompositionTracks), and the best way to edit your composition is to operate on each track, not on the whole composition.
But if your purpose is to speed up your video's rate, editing tracks will not be necessary.
So I will try my best to tell you what I know about your question.
About video stuttering while playing
Possible reason for stuttering
Notice that you are using the method removeTimeRange(range: CMTimeRange). This method will remove the time range from the composition, yes, but it will NOT automatically fill in the empty space left behind by each removed range.
Visualized example
[F stands for Frame, E stands for Empty]
org_video --> F-F-F-F-F-F-F-F...
After removing the time ranges, the composition will look like this:
after_video --> F-E-F-E-F-E-F-E...
while you might expect the video to look like this:
target_video --> F-F-F-F...
This is the most likely reason for the stuttering during playback.
Suggested solution
So if you want to shorten your video and make its rate faster or slower, you need the method scaleTimeRange(_:toDuration:).
Example
// project is your AVMutableComposition; if the video runs 200s,
// this scales it to 100s, i.e. plays it at 2x speed.
project.scaleTimeRange(
    CMTimeRangeMake(kCMTimeZero, project.duration),
    toDuration: CMTimeMakeWithSeconds(100, 600))
This method makes the video faster or slower.
About the file size
A video file's size is mostly determined by its bit rate and container format. If you are using H.264, the most likely cause of the size increase is the bit rate.
In your code you are using AVAssetExportSession:
AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetHighestQuality)
You passed the preset AVAssetExportPresetHighestQuality.
In my own project, after using this preset, the exported video's bit rate came out at 20~30 Mbps no matter what the source video's bit rate was, and Apple's presets do not allow you to set the bit rate manually.
Possible solution
There is a third-party tool called SDAVAssetExportSession; it allows you to fully configure your export session, and you might want to study its code to build a custom export preset.
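As a hedged illustration of what that configuration might look like (the settings dictionaries mirror AVAssetWriter's keys; verify the exact Swift bridging against the version of SDAVAssetExportSession you install):
import AVFoundation

// Sketch only: SDAVAssetExportSession exposes AVAssetWriter-style
// settings dictionaries, letting you pin the bit rate yourself.
let encoder = SDAVAssetExportSession(asset: asset)
encoder.outputFileType = AVFileTypeMPEG4
encoder.outputURL = filePath // reuse the URL from createFileFromAsset
encoder.videoSettings = [
    AVVideoCodecKey: AVVideoCodecH264,
    AVVideoWidthKey: 1280,
    AVVideoHeightKey: 544,
    AVVideoCompressionPropertiesKey: [
        AVVideoAverageBitRateKey: 1200000, // ~1.2 Mbit/s, close to the source
        AVVideoProfileLevelKey: AVVideoProfileLevelH264High41,
    ],
]
encoder.audioSettings = [
    AVFormatIDKey: kAudioFormatMPEG4AAC,
    AVNumberOfChannelsKey: 2,
    AVSampleRateKey: 44100,
    AVEncoderBitRateKey: 128000,
]
encoder.exportAsynchronously {
    print("export finished with status \(encoder.status.rawValue)")
}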
That is what I can tell you for now. Hope it helps :>

AVPlayer not playing local mp4

I am trying to use an AVPlayer to play a video that has been recorded in my app. However, the player won't play the video. I know for a fact that this is a properly recorded mp4 file, because I can take it and play it on my Mac just fine. Here's the setup for the player:
let documents = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask).first!
let URL = NSURL(fileURLWithPath: "tempVideo", relativeToURL: documents)
let asset = AVAsset(URL: URL)
let item = AVPlayerItem(asset: asset)
//videoPlayer is a property on the view controller being used
videoPlayer = AVPlayer()
//videoPlayerLayer is a property on the view controller being used
videoPlayerLayer = AVPlayerLayer(player: videoPlayer)
videoPlayerLayer.frame.size = view.frame.size
videoPlayerLayer.backgroundColor = UIColor.redColor().CGColor
view.layer.addSublayer(videoPlayerLayer!)
//wait 5 seconds
videoPlayer.play()
I know for sure that the videoPlayer is, in fact, ready to play, because I've checked its status property. I also know that videoPlayerLayer has properly been added to view.layer, because it's visible and takes up the whole screen. When I call videoPlayer.play(), the music playing on the device stops, but videoPlayerLayer doesn't show anything.
Any ideas? Thank you in advance for the help!
EDIT: I forgot to show that videoPlayerLayer is indeed connected to videoPlayer, I have updated my question to reflect this.
The correct answer was given by @Dershowitz123, but he or she left it in a comment so I can't mark it as correct. The solution was to change the URL to include the .mp4 extension. Thank you for your help.
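For completeness, a sketch of the corrected setup (variable names mirror the question's; the key change is the .mp4 suffix, which lets AVFoundation infer the container format):
let documents = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask).first!
// Include the extension the file was saved without.
let URL = NSURL(fileURLWithPath: "tempVideo.mp4", relativeToURL: documents)
let asset = AVAsset(URL: URL)
let item = AVPlayerItem(asset: asset)
// Attach the item to the player (the original snippet created an empty AVPlayer).
videoPlayer = AVPlayer(playerItem: item)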

iOS play video from data URI

I am attempting to play a video by using a data URI (data:video/mp4;base64,AAAAHGZ0eXBtcDQyAAAAAG1wNDJpc29......). Here is my code thus far:
func videoDataWasLoaded(data: NSData) {
    let moviePlayer = MPMoviePlayerController()
    let base64 = data.base64EncodedStringWithOptions(NSDataBase64EncodingOptions(rawValue: 0))
    let dataUri = NSURL(string: "data:video/mp4;base64,\(base64)")
    moviePlayer.contentURL = dataUri
    moviePlayer.play()
}
I have confirmed that the video plays by writing the data (NSData) to a tmp file and then using that for the contentURL. However, writing to disk is slow, and I figured that the data URI approach would be faster, especially since my movie files are small (around 5 seconds each).
UPDATE: This question is not so much concerned about which method (AVPlayer, MPMoviePlayerController) is used to play the video. Rather, it is concerned with the possibility of playing a video from a data URI. Here is a link which describes what I am wanting to do in terms of HTML5.
This code plays a movie from a URL ... assuming that is your question?
let videoURL = self.currentSlide.aURL
self.playerItem = AVPlayerItem(URL: videoURL)
self.player = AVPlayer(playerItem: self.playerItem)
self.playerLayer = AVPlayerLayer(player: self.player)
self.streamPlayer = AVPlayerViewController()
self.streamPlayer.player = self.player
self.streamPlayer.view.frame = CGRect(x: 128, y: 222, width: 512, height: 256)
self.presentViewController(self.streamPlayer, animated: true) {
    self.streamPlayer.player!.play()
}
But sorry, that plays a URL; you want a URI. I thought you had mistyped your question, my error. I looked up URI this time :|
The answer must surely lie in converting your video source into a playable stream, such as an M3U8. Here is an excellent post on the subject:
http://stackoverflow.com/questions/6592485/http-live-streaming
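That said, if writing to disk is the only concern, a hedged alternative (my sketch, not from this answer) is to keep the data in memory and hand it to AVPlayer through AVAssetResourceLoaderDelegate; a custom URL scheme forces the loader to call our delegate instead of hitting the network:
import AVFoundation

// Sketch under assumptions: "movieData" is the video's Data, and the
// custom "memory://" scheme routes loading through our delegate.
final class DataLoader: NSObject, AVAssetResourceLoaderDelegate {
    let data: Data
    init(data: Data) { self.data = data }

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        // Describe the "file" we are serving from memory.
        loadingRequest.contentInformationRequest?.contentType = AVFileTypeMPEG4
        loadingRequest.contentInformationRequest?.contentLength = Int64(data.count)
        loadingRequest.contentInformationRequest?.isByteRangeAccessSupported = true
        // Serve the requested byte range straight from the Data object.
        if let dataRequest = loadingRequest.dataRequest {
            let start = Int(dataRequest.requestedOffset)
            let end = min(start + dataRequest.requestedLength, data.count)
            dataRequest.respond(with: data.subdata(in: start..<end))
        }
        loadingRequest.finishLoading()
        return true
    }
}

// Usage: keep strong references to both the loader and the player.
let loader = DataLoader(data: movieData)
let asset = AVURLAsset(url: URL(string: "memory://movie.mp4")!)
asset.resourceLoader.setDelegate(loader, queue: DispatchQueue(label: "video.loader"))
let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))
player.play()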

Set MPNowPlayingInfoCenter with other background audio playing

I am trying to play a video using MPMoviePlayerController for an iOS app in Swift.
My goal is to be able to play system music with something like Apple Music, then open my app and have the audio mix in, but I want my app to be able to take control of MPNowPlayingInfoCenter.
How can I use AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, withOptions: .MixWithOthers) while still setting the MPNowPlayingInfoCenter?
Google Maps manages to mix in audio while still setting the MPNowPlayingInfoCenter. Below is how I am trying to set it:
func setMeta(){
    UIApplication.sharedApplication().beginReceivingRemoteControlEvents()
    self.becomeFirstResponder()
    if let player = PlayWorkoutViewController.player {
        let coverArt = MPMediaItemArtwork(image: UIImage(named: "AlbumArt")!)
        let dict: [String: AnyObject] = [
            MPMediaItemPropertyArtwork: coverArt,
            MPMediaItemPropertyTitle: workout.title,
            MPMediaItemPropertyArtist: "Alex",
            MPMediaItemPropertyAlbumTitle: workout.program.title,
            MPNowPlayingInfoPropertyPlaybackRate: player.currentPlaybackRate,
            MPNowPlayingInfoPropertyElapsedPlaybackTime: player.currentPlaybackTime,
            MPMediaItemPropertyPlaybackDuration: player.playableDuration
        ]
        MPNowPlayingInfoCenter.defaultCenter().nowPlayingInfo = dict
    }
}
The above function works when I am not playing outside music with the (.MixWithOthers) option at the same time, but while outside music is playing with (.MixWithOthers), the info center does not update.
Edit 1: Just to make things super clear: I already have video playing properly. I am trying to play video alongside other background audio while still being able to set MPNowPlayingInfoCenter.
This isn't currently possible in iOS. Even just changing your category options to .MixWithOthers causes your nowPlayingInfo to be ignored.
My guess is iOS only considers non-mixing apps for inclusion in MPNowPlayingInfoCenter, because there is uncertainty as to which app would show up in (e.g.) Control Center if there are multiple mixing apps playing at the same time.
I'd very much like it if iOS used a best-effort approach for choosing the "now playing app", something like this:
If there's a non-mixing app playing, pick that. Else..
If there's only one mixing app playing, pick that. Else..
If there are multiple mixing apps playing, just pick one :) Or pick none, I'm fine with either.
If you'd like this behavior as well, I'd encourage you to file a bug with Apple.
Have you tried implementing your own custom function to update the MPNowPlayingInfoCenter? Recently I was using an AVAudioPlayer to play music and needed to do the updating manually.
This is basically the function I called upon a new song being loaded.
func updateNowPlayingCenter() {
    let center = MPNowPlayingInfoCenter.defaultCenter()
    if nowPlayingItem == nil {
        center.nowPlayingInfo = nil
    } else {
        var songInfo = [String: AnyObject]()
        // Add each item to the dictionary if it exists
        if let artist = nowPlayingItem?.artist {
            songInfo[MPMediaItemPropertyArtist] = artist
        }
        if let title = nowPlayingItem?.title {
            songInfo[MPMediaItemPropertyTitle] = title
        }
        if let albumTitle = nowPlayingItem?.albumTitle {
            songInfo[MPMediaItemPropertyAlbumTitle] = albumTitle
        }
        if let playbackDuration = nowPlayingItem?.playbackDuration {
            songInfo[MPMediaItemPropertyPlaybackDuration] = playbackDuration
        }
        if let artwork = nowPlayingItem?.artwork {
            songInfo[MPMediaItemPropertyArtwork] = artwork
        }
        center.nowPlayingInfo = songInfo
    }
}
I am not sure if doing this when a movie is loaded will override the MPMoviePlayerController, but it seems worth a shot.
Additionally, MPMoviePlayerController has been deprecated and replaced with AVPlayerViewController, so that's also worth looking into.
Edit: Also, I would check to make sure that you are properly receiving remote control events, as this affects the data displayed by the info center.
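On that last point, here is a minimal sketch of the remote-control plumbing (assuming the question's view controller and its PlayWorkoutViewController.player; this is illustrative, not the asker's code):
// The view controller must be first responder to receive remote events.
override func canBecomeFirstResponder() -> Bool {
    return true
}

override func remoteControlReceivedWithEvent(event: UIEvent?) {
    guard let event = event where event.type == .RemoteControl else { return }
    switch event.subtype {
    case .RemoteControlPlay:
        PlayWorkoutViewController.player?.play()
    case .RemoteControlPause:
        PlayWorkoutViewController.player?.pause()
    default:
        break
    }
}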
To play the video in Swift use this:
func playVideoEffect() {
    let path = NSBundle.mainBundle().pathForResource("egg_grabberAnmi", ofType: "mp4")
    let url = NSURL.fileURLWithPath(path!)
    self.moviePlayer = MPMoviePlayerController(contentURL: url)
    if let player = moviePlayer {
        let screenSize: CGRect = UIScreen.mainScreen().bounds
        player.view.frame = CGRect(x: frame.size.width * 0.10, y: frame.size.width / 2, width: screenSize.width * 0.80, height: screenSize.width * 0.80)
        player.view.sizeToFit()
        player.scalingMode = MPMovieScalingMode.Fill
        player.fullscreen = true
        player.controlStyle = MPMovieControlStyle.None
        player.movieSourceType = MPMovieSourceType.File
        player.play()
        self.view?.addSubview(player.view)
        NSTimer.scheduledTimerWithTimeInterval(6.0, target: self, selector: Selector("update"), userInfo: nil, repeats: false)
    }
}
// And use the function to play the video:
playVideoEffect()
