VLCMediaPlayer has no member repeatMode, while VLCMediaListPlayer does. I'm currently setting it as:
mediaPlayer.repeatMode = VLCRepeatMode.repeatCurrentItem
You need to create a VLCMediaList and play the first media:
let media = VLCMedia(url: url)
let mediaList = VLCMediaList()
mediaList.add(media)
let mediaListPlayer = VLCMediaListPlayer(drawable: mediaPlayerView)
mediaListPlayer.mediaList = mediaList
mediaListPlayer.repeatMode = .repeatCurrentItem
mediaListPlayer.play(media)
mediaPlayerView is the view where the video will play.
I need to use ReplayKit (Broadcast Extension UI) to be able to cast content from an iPhone to a TV (Chromecast).
Currently I am using the HaishinKit library. I have written content to an HTTPStream (CMSampleBuffer); when I use this URL and cast to the TV, it doesn't work.
let mediaURL = URL(string: "abc.m3u8")!
let mediaInfoBuilder = GCKMediaInformationBuilder(contentURL: mediaURL)
mediaInfoBuilder.streamType = GCKMediaStreamType.buffered
mediaInfoBuilder.contentID = mediaURL.absoluteString
mediaInfoBuilder.contentType = mediaURL.mimeType()
mediaInfoBuilder.hlsSegmentFormat = .TS
mediaInfoBuilder.hlsVideoSegmentFormat = .MPEG2_TS
mediaInfoBuilder.streamDuration = .infinity
Where am I going wrong?
Is there any other way to stream content to Chromecast? Using HTTPStream, the content is delayed by about 5 to 10 seconds.
Thanks.
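One thing worth double-checking on the Cast side is contentType: Chromecast expects an HLS playlist to be declared as application/x-mpegURL (or application/vnd.apple.mpegurl). A minimal extension-to-MIME lookup in modern Swift, as a sanity check; the helper name is hypothetical and merely stands in for the question's mediaURL.mimeType():

```swift
import Foundation

// Map a media file extension to the MIME type Chromecast expects.
// (Hypothetical helper, standing in for the question's mediaURL.mimeType().)
func mimeType(forExtension ext: String) -> String {
    switch ext.lowercased() {
    case "m3u8": return "application/x-mpegURL"  // HLS playlist
    case "mp4":  return "video/mp4"
    case "ts":   return "video/mp2t"             // HLS media segment
    default:     return "application/octet-stream"
    }
}
```

If your server or helper reports something generic like application/octet-stream for the playlist, the receiver may silently refuse to play it.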
I'm setting the title and description as metadata for an AVPlayer video in tvOS.
How can set the player duration in metadata?
The info overlay should automatically display the correct duration. If not, you can try this:
let duration = player?.currentItem?.duration
let titleItem = AVMutableMetadataItem()
titleItem.key = AVMetadataCommonKeyTitle
titleItem.keySpace = AVMetadataKeySpaceCommon
titleItem.locale = NSLocale.currentLocale()
titleItem.value = "My Video"
if let duration = duration {
    titleItem.duration = duration
}
// Attach the item to the player item so the tvOS overlay can read it
player?.currentItem?.externalMetadata = [titleItem]
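If you end up presenting the duration yourself rather than relying on the overlay, turning the item's duration in seconds into an H:MM:SS string is a small framework-free job. A sketch in modern Swift; the helper name is hypothetical:

```swift
import Foundation

// Format a duration in seconds as H:MM:SS, or M:SS under an hour.
func formatDuration(_ seconds: Double) -> String {
    let total = Int(seconds.rounded())
    let h = total / 3600
    let m = (total % 3600) / 60
    let s = total % 60
    return h > 0 ? String(format: "%d:%02d:%02d", h, m, s)
                 : String(format: "%d:%02d", m, s)
}
```

The input would typically come from CMTimeGetSeconds(duration) once the item's duration is known.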
I am currently working on HTTP Live Streaming video with AVPlayerViewController / AVPlayer.
I am playing a video backed by an .m3u8 file.
It plays fine, but my question is about the variant data: I set up 4 resolutions while generating the .m3u8 file, so the resolution can vary at my endpoint. How do I get all the values I configured at my endpoint? On Android I can fetch all the video information using Track, but on iOS how can I fetch all the details of the video, like its height, width, tracks, and the resolutions the video supports?
I have searched a lot but could not succeed.
Need help.
Thanks in advance.
Anita, I hope this helps; here is a slice of code for playing a VIDEO:
self.command.text = "Loading VIDEO"
let videoURL = self.currentSlide.aURL
self.playerItem = AVPlayerItem(URL: videoURL)
self.player = AVPlayer(playerItem: self.playerItem)
self.playerLayer = AVPlayerLayer(player: self.player)
self.streamPlayer = AVPlayerViewController()
self.streamPlayer.player = self.player
self.streamPlayer.view.frame = CGRect(x: 128, y: 222, width: 512, height: 256)
// Dump whatever metadata the asset carries
let currentFM = self.streamPlayer.player?.currentItem?.asset
for blah in (currentFM?.metadata.enumerate())! {
    print("blah \(blah)")
}
self.presentViewController(self.streamPlayer, animated: true, completion: nil)
self.streamPlayer.player!.play()
I added a little extra to show metadata about the video... it printed:
blah (0, <AVMetadataItem: 0x15e7e4280, identifier=itsk/gsst, keySpace=itsk, key class = __NSCFNumber, key=gsst, commonKey=(null), extendedLanguageTag=(null), dataType=com.apple.metadata.datatype.UTF-8, time={INVALID}, duration={INVALID}, startDate=(null), extras={
dataType = 1;
dataTypeNamespace = "com.apple.itunes";
}, value=0>)
blah (1, <AVMetadataItem: 0x15e7e1a50, identifier=itsk/gstd, keySpace=itsk, key class = __NSCFNumber, key=gstd, commonKey=(null), extendedLanguageTag=(null), dataType=com.apple.metadata.datatype.UTF-8, time={INVALID}, duration={INVALID}, startDate=(null), extras={
dataType = 1;
dataTypeNamespace = "com.apple.itunes";
}, value=222980>)
Hopefully it means a lot more to you than it does to me :) What you're looking for, I think, is classed/called metadata in Apple-speak...
Take a look at this post too
Capture still image from AVCaptureSession in Swift
It describes how to capture a frame of your video; once you have a frame you can take a closer look at its metadata and, I suspect, find some of the details you seek.
let metadata = info[UIImagePickerControllerMediaMetadata] as? NSDictionary
let image = info[UIImagePickerControllerOriginalImage] as? UIImage
are the commands to try to fetch that... let me know if you manage to succeed!
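Another route, since the question is really about the variants configured at the endpoint: those resolutions live in the master playlist itself, so you can fetch the .m3u8 text and parse its #EXT-X-STREAM-INF lines. A framework-free sketch in modern Swift; the function and type names are hypothetical, and quoted attribute values containing commas (e.g. CODECS) are not handled:

```swift
import Foundation

struct HLSVariant {
    let bandwidth: Int
    let resolution: String?
}

// Parse the #EXT-X-STREAM-INF attribute lines of an HLS master playlist.
func parseMasterPlaylist(text: String) -> [HLSVariant] {
    var variants: [HLSVariant] = []
    for line in text.components(separatedBy: "\n") {
        guard line.hasPrefix("#EXT-X-STREAM-INF:") else { continue }
        let attrs = line.replacingOccurrences(of: "#EXT-X-STREAM-INF:", with: "")
        var bandwidth = 0
        var resolution: String?
        // Attributes are comma-separated KEY=VALUE pairs.
        for pair in attrs.components(separatedBy: ",") {
            let kv = pair.components(separatedBy: "=")
            guard kv.count == 2 else { continue }
            switch kv[0] {
            case "BANDWIDTH":  bandwidth = Int(kv[1]) ?? 0
            case "RESOLUTION": resolution = kv[1]
            default: break
            }
        }
        variants.append(HLSVariant(bandwidth: bandwidth, resolution: resolution))
    }
    return variants
}
```

This gives you exactly the 4 resolutions you generated, without waiting for AVPlayer to surface anything.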
I am trying to play a video using MPMoviePlayerController in an iOS app in Swift.
My goal is to play system music with something like Apple Music, then open my app and have the audio mix in, but I want my app to take control of MPNowPlayingInfoCenter.
How can I use AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, withOptions: .MixWithOthers) while still setting the MPNowPlayingInfoCenter?
Google Maps mixes in audio while setting MPNowPlayingInfoCenter. Below is how I am trying to set the MPNowPlayingInfoCenter:
func setMeta() {
    UIApplication.sharedApplication().beginReceivingRemoteControlEvents()
    self.becomeFirstResponder()
    if let player = PlayWorkoutViewController.player {
        let coverArt = MPMediaItemArtwork(image: UIImage(named: "AlbumArt")!)
        let dict: [String: AnyObject] = [
            MPMediaItemPropertyArtwork: coverArt,
            MPMediaItemPropertyTitle: workout.title,
            MPMediaItemPropertyArtist: "Alex",
            MPMediaItemPropertyAlbumTitle: workout.program.title,
            MPNowPlayingInfoPropertyPlaybackRate: player.currentPlaybackRate,
            MPNowPlayingInfoPropertyElapsedPlaybackTime: player.currentPlaybackTime,
            MPMediaItemPropertyPlaybackDuration: player.playableDuration
        ]
        MPNowPlayingInfoCenter.defaultCenter().nowPlayingInfo = dict
    }
}
The function above works when I am not playing outside music with the .MixWithOthers option at the same time, but when I am, the info center does not update.
Edit 1: Just to make things super clear: I already have video playing properly; I am trying to play video alongside other background audio while still being able to set MPNowPlayingInfoCenter.
This isn't currently possible in iOS. Even just changing your category options to .MixWithOthers causes your nowPlayingInfo to be ignored.
My guess is iOS only considers non-mixing apps for inclusion in MPNowPlayingInfoCenter, because there is uncertainty as to which app would show up in (e.g.) Control Center if there are multiple mixing apps playing at the same time.
I'd very much like it if iOS used a best-effort approach for choosing the "now playing app", something like this:
If there's a non-mixing app playing, pick that. Else..
If there's only one mixing app playing, pick that. Else..
If there are multiple mixing apps playing, just pick one :) Or pick none, I'm fine with either.
If you'd like this behavior as well, I'd encourage you to file a bug with Apple.
Have you tried implementing your own custom function to update the MPNowPlayingInfoCenter? Recently I was using an AVAudioPlayer to play music and needed to do the updating manually.
This is basically the function I called upon a new song being loaded.
func updateNowPlayingCenter() {
    let center = MPNowPlayingInfoCenter.defaultCenter()
    if nowPlayingItem == nil {
        center.nowPlayingInfo = nil
    } else {
        var songInfo = [String: AnyObject]()
        // Add each value to the dictionary only if it exists
        if let artist = nowPlayingItem?.artist {
            songInfo[MPMediaItemPropertyArtist] = artist
        }
        if let title = nowPlayingItem?.title {
            songInfo[MPMediaItemPropertyTitle] = title
        }
        if let albumTitle = nowPlayingItem?.albumTitle {
            songInfo[MPMediaItemPropertyAlbumTitle] = albumTitle
        }
        if let playbackDuration = nowPlayingItem?.playbackDuration {
            songInfo[MPMediaItemPropertyPlaybackDuration] = playbackDuration
        }
        if let artwork = nowPlayingItem?.artwork {
            songInfo[MPMediaItemPropertyArtwork] = artwork
        }
        center.nowPlayingInfo = songInfo
    }
}
I am not sure if doing this when a movie is loaded will override the MPMoviePlayerController, but it seems worth a shot.
Additionally, MPMoviePlayerController has been deprecated and replaced with AVPlayerViewController, so that's also worth looking into.
Edit: I would also check that you are properly receiving remote-control events, as this affects the data displayed by the info center.
To play the video in Swift, use this:
func playVideoEffect() {
    let path = NSBundle.mainBundle().pathForResource("egg_grabberAnmi", ofType: "mp4")
    let url = NSURL.fileURLWithPath(path!)
    self.moviePlayer = MPMoviePlayerController(contentURL: url)
    if let player = moviePlayer {
        let screenSize: CGRect = UIScreen.mainScreen().bounds
        player.view.frame = CGRect(x: frame.size.width * 0.10, y: frame.size.width / 2,
                                   width: screenSize.width * 0.80, height: screenSize.width * 0.80)
        player.view.sizeToFit()
        player.scalingMode = MPMovieScalingMode.Fill
        player.fullscreen = true
        player.controlStyle = MPMovieControlStyle.None
        player.movieSourceType = MPMovieSourceType.File
        player.play()
        self.view?.addSubview(player.view)
        // Fires once after 6 seconds; an update() method must exist on self
        NSTimer.scheduledTimerWithTimeInterval(6.0, target: self, selector: Selector("update"),
                                               userInfo: nil, repeats: false)
    }
}
// And use the function to play the video
playVideoEffect()
I am combining two audio files into one. I set up two sliders to change the volume of each audio file. When I try to set preferredVolume on an AVAssetTrack I get this: (#lvalue Float) -> $T5 is not identical to Float. Are there any other ways to accomplish this? The code is in Swift, but I don't mind if the answer is in Objective-C.
EDIT: How can I change the volume of each audio file with a slider or with a Float?
Code:
let type = AVMediaTypeAudio
let asset1 = AVURLAsset(URL: beatLocationURL, options: nil)
let arr2 = asset1.tracksWithMediaType(type)
let track2 = arr2.last as AVAssetTrack
track2.preferredVolume(beatVolume.value) // <-- where the error occurs
let duration: CMTime = track2.timeRange.duration
let comp = AVMutableComposition()
let comptrack = comp.addMutableTrackWithMediaType(type,
    preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
comptrack.insertTimeRange(CMTimeRangeMake(CMTimeMakeWithSeconds(0,600), CMTimeMakeWithSeconds(5,600)), ofTrack:track2, atTime:CMTimeMakeWithSeconds(0,600), error:nil)
comptrack.insertTimeRange(CMTimeRangeMake(CMTimeSubtract(duration, CMTimeMakeWithSeconds(5,600)), CMTimeMakeWithSeconds(5,600)), ofTrack:track2, atTime:CMTimeMakeWithSeconds(5,600), error:nil)
let type3 = AVMediaTypeAudio
let asset = AVURLAsset(URL: vocalURL, options:nil)
let arr3 = asset.tracksWithMediaType(type3)
let track3 = arr3.last as AVAssetTrack
let comptrack3 = comp.addMutableTrackWithMediaType(type3, preferredTrackID:Int32(kCMPersistentTrackID_Invalid))
comptrack3.insertTimeRange(CMTimeRangeMake(CMTimeMakeWithSeconds(0,600), CMTimeMakeWithSeconds(10,600)), ofTrack:track3, atTime:CMTimeMakeWithSeconds(0,600), error:nil)
let params = AVMutableAudioMixInputParameters(track:comptrack3)
params.setVolume(1, atTime:CMTimeMakeWithSeconds(0,600))
params.setVolumeRampFromStartVolume(1, toEndVolume:0, timeRange:CMTimeRangeMake(CMTimeMakeWithSeconds(7,600), CMTimeMakeWithSeconds(3,600)))
let mix = AVMutableAudioMix()
mix.inputParameters = [params]
let item = AVPlayerItem(asset:comp)
item.audioMix = mix
You can't say this:
track2.preferredVolume(beatVolume.value)
If you want to set track2.preferredVolume, then say:
track2.preferredVolume = beatVolume.value
Of course, that will only work if beatVolume.value is a Float. If it isn't, you will have to make a Float out of it somehow.
Also, you won't be able to set the preferredVolume of track2, because it's an AVAssetTrack. An AVAssetTrack's preferredVolume isn't settable. What you want to do is wait until you're setting up your AVMutableComposition and set the volumes on its tracks. For example:
comptrack3.preferredVolume = 0.5
That will compile, and now you can figure out how to substitute some other Float as the real value. (It will not, however, change the volume of one track relative to another. If that's your goal, use an AVMutableAudioMix. See, for example, Apple's sample code here: https://developer.apple.com/library/ios/qa/qa1716/_index.html.)
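For intuition about what the ramp in the question's code produces: setVolumeRampFromStartVolume(_:toEndVolume:timeRange:) applies a linear interpolation of volume across the time range. A framework-free sketch of that math in modern Swift; the function name is hypothetical, with times expressed in seconds rather than CMTime:

```swift
import Foundation

// Volume at a given time under a linear ramp: constant before the ramp,
// linearly interpolated inside it, constant after it.
func rampedVolume(at time: Double, start: Float, end: Float,
                  rampStart: Double, rampDuration: Double) -> Float {
    if time <= rampStart { return start }
    if time >= rampStart + rampDuration { return end }
    let fraction = Float((time - rampStart) / rampDuration)
    return start + (end - start) * fraction
}
```

With the question's ramp (start volume 1, end volume 0, over seconds 7 to 10), the track holds full volume until second 7 and then fades linearly to silence by second 10. A slider would feed its Float value in as the start (or end) volume of such a ramp, via AVMutableAudioMixInputParameters.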