tvOS video: How to set the duration of a video in metadata

I'm setting the title and description as metadata for an AVPlayer video in tvOS.
How can I set the player duration in the metadata?

The info overlay should automatically display the correct duration. If it doesn't, you can try this:
// The info panel's duration normally comes from the player item itself,
// but AVMutableMetadataItem also exposes a writable CMTime duration:
if let duration = player?.currentItem?.duration {
    let titleItem = AVMutableMetadataItem()
    titleItem.key = AVMetadataCommonKeyTitle
    titleItem.keySpace = AVMetadataKeySpaceCommon
    titleItem.locale = NSLocale.currentLocale()
    titleItem.value = "My Video"
    titleItem.duration = duration
}
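For the items to actually show up in the tvOS info panel they have to be attached to the player item's externalMetadata. Below is a minimal sketch in current Swift syntax; the helper name and the title/description strings are placeholders, and the "und" language tag is the value commonly used so the items display:

import AVKit

func setInfoPanelMetadata(for item: AVPlayerItem, title: String, description: String) {
    // Hypothetical helper: builds common-identifier items and attaches them.
    func makeItem(_ identifier: AVMetadataIdentifier, _ value: String) -> AVMetadataItem {
        let metadataItem = AVMutableMetadataItem()
        metadataItem.identifier = identifier
        metadataItem.value = value as NSString
        metadataItem.extendedLanguageTag = "und"   // commonly needed for tvOS display
        return metadataItem
    }
    item.externalMetadata = [
        makeItem(.commonIdentifierTitle, title),
        makeItem(.commonIdentifierDescription, description)
    ]
    // The duration in the overlay is read from the item itself, not from metadata.
}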

Related

Video's 'Artist' metadata not showing up on OS media player when played through AVPlayerViewController

When playing a video through AVPlayerViewController, its artist metadata does not show up below the title in the iOS Control Center / Lock Screen player.
Right now, I'm setting metadata on the AVPlayerItem itself in a function within an AVPlayerItem extension, along the lines of this:
func setMetadata(title: String, artist: String) {
    let titleItem = AVMutableMetadataItem()
    titleItem.identifier = AVMetadataIdentifier.commonIdentifierTitle
    titleItem.value = title as (NSCopying & NSObjectProtocol)
    self.externalMetadata.append(titleItem)

    let artistItem = AVMutableMetadataItem()
    artistItem.identifier = AVMetadataIdentifier.commonIdentifierArtist
    artistItem.value = artist as (NSCopying & NSObjectProtocol)
    self.externalMetadata.append(artistItem)
}
The title item works properly, but the artist is not updating. Any thoughts? Do videos have some other metadata field shown in the player that isn't artist?
I've tried .commonIdentifierDescription and .iTunesMetadataTrackSubtitle as alternate metadata identifiers but these are not working as expected.
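One route worth noting: the Control Center / Lock Screen fields can also be supplied through MPNowPlayingInfoCenter rather than externalMetadata. A minimal sketch with the standard MediaPlayer keys follows; the function name is hypothetical, and whether this is the right fix for the missing artist here is untested:

import MediaPlayer

func publishNowPlaying(title: String, artist: String, duration: TimeInterval) {
    // Standard Now Playing keys; Control Center and the Lock Screen read these directly.
    MPNowPlayingInfoCenter.default().nowPlayingInfo = [
        MPMediaItemPropertyTitle: title,
        MPMediaItemPropertyArtist: artist,
        MPMediaItemPropertyPlaybackDuration: duration
    ]
}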

Using ReplayKit & HTTPStream and cast to Chromecast

I need to use ReplayKit (Broadcast extension UI) to cast content from an iPhone to a TV (Chromecast).
Currently I am using the HaishinKit library; I write the content (CMSampleBuffer) to an HTTPStream and then use that URL to cast to the TV, but it doesn't work.
let url = URL(string: "abc.m3u8")!   // the playlist written by HaishinKit's HTTPStream
let mediaInfoBuilder = GCKMediaInformationBuilder(contentURL: url)
mediaInfoBuilder.streamType = GCKMediaStreamType.buffered
mediaInfoBuilder.contentID = url.absoluteString
mediaInfoBuilder.contentType = url.mimeType()   // mimeType() is a custom URL extension
mediaInfoBuilder.hlsSegmentFormat = .TS
mediaInfoBuilder.hlsVideoSegmentFormat = .MPEG2_TS
mediaInfoBuilder.streamDuration = .infinity
Where am I going wrong?
Is there any other way to stream content to Chromecast? Using HTTPStream, the content is delayed by about 5 to 10 seconds.
Thanks.
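For reference, the built GCKMediaInformation still has to be handed to the remote media client of an active cast session. A minimal sketch with the standard Cast SDK calls follows; the builder settings above are the question's own and are not a verified working configuration:

import GoogleCast

// Build the media information and load it on the connected Chromecast.
let mediaInfo = mediaInfoBuilder.build()
let remoteClient = GCKCastContext.sharedInstance().sessionManager.currentCastSession?.remoteMediaClient
remoteClient?.loadMedia(mediaInfo)
// Note: the Chromecast fetches the playlist URL itself, so it must be a full
// http(s) URL reachable on the same network, not just "abc.m3u8".
// For a live HTTPStream, streamType may need to be .live rather than .buffered.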

How can I set a video in a loop in MobileVLCKit?

VLCMediaPlayer has no member repeatMode, while VLCMediaListPlayer does have a repeatMode.
I'm currently setting it as:
mediaPlayer.repeatMode = VLCRepeatMode.repeatCurrentItem
You need to create a media list and play the first media:
let media = VLCMedia(url: url)
let mediaList = VLCMediaList()
mediaList.add(media)
let mediaListPlayer = VLCMediaListPlayer(drawable: mediaPlayerView)
mediaListPlayer.mediaList = mediaList
mediaListPlayer.repeatMode = .repeatCurrentItem
mediaListPlayer.play(media)
mediaPlayerView is the view where the video will play.

Set MPNowPlayingInfoCenter with other background audio playing

I am trying to play a video using MPMoviePlayerController for an iOS app in Swift.
My goal is to be able to play system music with something like Apple Music, then open my app and have the audio mix in, but I want my app to be able to take control of MPNowPlayingInfoCenter.
How can I use AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, withOptions: .MixWithOthers) while still setting the MPNowPlayingInfoCenter?
Google Maps mixes in its audio while still setting the MPNowPlayingInfoCenter. Below is how I am trying to set the MPNowPlayingInfoCenter:
func setMeta() {
    UIApplication.sharedApplication().beginReceivingRemoteControlEvents()
    self.becomeFirstResponder()
    if let player = PlayWorkoutViewController.player {
        let coverArt = MPMediaItemArtwork(image: UIImage(named: "AlbumArt")!)
        let dict: [String: AnyObject] = [
            MPMediaItemPropertyArtwork: coverArt,
            MPMediaItemPropertyTitle: workout.title,
            MPMediaItemPropertyArtist: "Alex",
            MPMediaItemPropertyAlbumTitle: workout.program.title,
            MPNowPlayingInfoPropertyPlaybackRate: player.currentPlaybackRate,
            MPNowPlayingInfoPropertyElapsedPlaybackTime: player.currentPlaybackTime,
            MPMediaItemPropertyPlaybackDuration: player.playableDuration
        ]
        MPNowPlayingInfoCenter.defaultCenter().nowPlayingInfo = dict
    }
}
The above function works when I am not playing outside music with the (.MixWithOthers) option at the same time, but while outside music is playing with (.MixWithOthers), the info center does not update.
Edit 1: Just to make things super clear, I already have video playing properly; I am trying to play video alongside other background audio while still being able to set MPNowPlayingInfoCenter.
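For reference, the mixing session setup being described looks like this in current Swift syntax (a minimal sketch, not the question's original code):

import AVFoundation

// Minimal sketch: activate a playback session that mixes with other apps' audio.
do {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback, options: [.mixWithOthers])
    try session.setActive(true)
} catch {
    print("Audio session error: \(error)")
}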
This isn't currently possible in iOS. Even just changing your category options to .MixWithOthers causes your nowPlayingInfo to be ignored.
My guess is iOS only considers non-mixing apps for inclusion in MPNowPlayingInfoCenter, because there is uncertainty as to which app would show up in (e.g.) Control Center if there are multiple mixing apps playing at the same time.
I'd very much like it if iOS used a best-effort approach for choosing the "now playing app", something like this:
If there's a non-mixing app playing, pick that. Else..
If there's only one mixing app playing, pick that. Else..
If there are multiple mixing apps playing, just pick one :) Or pick none, I'm fine with either.
If you'd like this behavior as well, I'd encourage you to file a bug with Apple.
Have you tried implementing your own custom function to update the MPNowPlayingInfoCenter? Recently I was using an AVAudioPlayer to play music and needed to do the updating manually.
This is basically the function I called upon a new song being loaded.
func updateNowPlayingCenter() {
    let center = MPNowPlayingInfoCenter.defaultCenter()
    if nowPlayingItem == nil {
        center.nowPlayingInfo = nil
    } else {
        var songInfo = [String: AnyObject]()
        // Add each value to the dictionary only if it exists
        if let artist = nowPlayingItem?.artist {
            songInfo[MPMediaItemPropertyArtist] = artist
        }
        if let title = nowPlayingItem?.title {
            songInfo[MPMediaItemPropertyTitle] = title
        }
        if let albumTitle = nowPlayingItem?.albumTitle {
            songInfo[MPMediaItemPropertyAlbumTitle] = albumTitle
        }
        if let playbackDuration = nowPlayingItem?.playbackDuration {
            songInfo[MPMediaItemPropertyPlaybackDuration] = playbackDuration
        }
        if let artwork = nowPlayingItem?.artwork {
            songInfo[MPMediaItemPropertyArtwork] = artwork
        }
        center.nowPlayingInfo = songInfo
    }
}
I am not sure if doing this upon a movie being loaded will override the MPMoviePlayerController, but it seems worth a shot.
Additionally, MPMoviePlayerController has been deprecated and replaced with AVPlayerViewController, so that's also worth looking into.
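A minimal sketch of the AVPlayerViewController replacement, in current Swift syntax (called from inside a view controller; videoURL is a placeholder):

import AVKit

// Present the modern player instead of MPMoviePlayerController.
let player = AVPlayer(url: videoURL)
let playerController = AVPlayerViewController()
playerController.player = player
present(playerController, animated: true) {
    player.play()
}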
Edit: Also I would check to make sure that you are properly receiving remote control events, as this impacts the data being displayed by the info center.
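On that remote-control point, the usual checklist is: start receiving the events, be able to become first responder, and handle the events. A minimal sketch in current Swift syntax (the class name is hypothetical; on newer iOS versions MPRemoteCommandCenter is the preferred route):

import UIKit
import MediaPlayer

class PlayerViewController: UIViewController {
    override var canBecomeFirstResponder: Bool { return true }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        UIApplication.shared.beginReceivingRemoteControlEvents()
        becomeFirstResponder()
    }

    override func remoteControlReceived(with event: UIEvent?) {
        guard let event = event, event.type == .remoteControl else { return }
        // React to play/pause/etc. and keep nowPlayingInfo in sync here.
    }
}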
To play the video in Swift, use this:
func playVideoEffect() {
    let path = NSBundle.mainBundle().pathForResource("egg_grabberAnmi", ofType: "mp4")
    let url = NSURL.fileURLWithPath(path!)
    self.moviePlayer = MPMoviePlayerController(contentURL: url)
    if let player = moviePlayer {
        let screenSize: CGRect = UIScreen.mainScreen().bounds
        player.view.frame = CGRect(x: frame.size.width * 0.10, y: frame.size.width / 2,
                                   width: screenSize.width * 0.80, height: screenSize.width * 0.80)
        player.view.sizeToFit()
        player.scalingMode = MPMovieScalingMode.Fill
        player.fullscreen = true
        player.controlStyle = MPMovieControlStyle.None
        player.movieSourceType = MPMovieSourceType.File
        player.play()
        self.view?.addSubview(player.view)
        // Fire once after 6 seconds (e.g. to dismiss the player again)
        let timer = NSTimer.scheduledTimerWithTimeInterval(6.0, target: self, selector: Selector("update"), userInfo: nil, repeats: false)
    }
}
// And call the function to play the video
playVideoEffect()

iOS: Get the length of the song from Parse

I have uploaded a few files to the Parse framework, and I am streaming those files and playing them in AVPlayer.
let url = currentAudioPath
let playerItem = AVPlayerItem(URL: NSURL(string: url)!)
player1 = AVPlayer(playerItem: playerItem)
player1.rate = 1.0
self.configurePlayer()
player1.play()
I am not downloading the entire file at once, but is there a way to retrieve the length of the song?
"AVPlayerItem" has a ".duration" property that might help you.
Look at the notes though: you won't be able to get the duration "until the status of the AVPlayerItem is AVPlayerItemStatusReadyToPlay". You could also set up key-value observation (KVO) on the property so you can update your UI once the item's duration is determined by the AVFoundation layer.
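A minimal sketch of that KVO approach in current Swift syntax (playerItem is the item created in the question; the statusObservation token has to be stored somewhere that outlives this call):

import AVFoundation

var statusObservation: NSKeyValueObservation?

// Observe the item's status and read the duration once it is ready to play.
statusObservation = playerItem.observe(\.status, options: [.initial, .new]) { item, _ in
    if item.status == .readyToPlay {
        let seconds = CMTimeGetSeconds(item.duration)
        print("Track length: \(seconds) seconds")   // update the UI here instead
    }
}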
