I use ijkplayer to play live video in my iOS app.
Here is the code for player initialization:
let options = IJKFFOptions.byDefault()
let streamUrl: String = "http://xxxxxxxxx.com/test/7777.flv"
let player = IJKFFMoviePlayerController(contentURLString: streamUrl, with: options)

// Let the video view resize with its parent
player?.view.autoresizingMask = [.flexibleWidth, .flexibleHeight]
player?.view.frame = CGRect(x: 0, y: 0, width: 1, height: 1)

// Tap to toggle between the compressed and expanded layouts
let gesture = UITapGestureRecognizer(target: self, action: #selector(self.compressExpand(_:)))
player?.view.addGestureRecognizer(gesture)

player?.view.backgroundColor = UIColor.black
player?.scalingMode = .aspectFit
player?.shouldAutoplay = false

self.player = player
self.player.shouldShowHudView = true
self.player.setPlayerOptionIntValue(1000, forKey: "max_cached_duration")
The issue: I use player.pause() to pause the player for some time, and then player.play() to resume. The live video then resumes from the frame where it was paused, but what I want is to resume from the latest (newest) frame of the stream.
You can release the player and initialize it again. This should start playback from the latest frame of the stream. It may not be the best solution, though.
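A minimal sketch of that approach, assuming the same streamUrl and options as in the question, a videoContainerView that hosts the player's view (my name, not from the question), and ijkplayer's shutdown()/prepareToPlay() calls from its IJKMediaPlayback interface (verify them against the version you use):

// Tear down the paused player and rebuild it so playback starts at the live edge
func resumeFromLiveEdge() {
    // Release the old instance
    player?.shutdown()
    player?.view.removeFromSuperview()

    // Recreate the player against the same live-stream URL
    let options = IJKFFOptions.byDefault()
    guard let newPlayer = IJKFFMoviePlayerController(contentURLString: streamUrl, with: options) else { return }
    newPlayer.view.frame = videoContainerView.bounds
    newPlayer.scalingMode = .aspectFit
    newPlayer.shouldAutoplay = true
    videoContainerView.addSubview(newPlayer.view)

    newPlayer.prepareToPlay()
    self.player = newPlayer
}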
I have an app which plays video clips on demand. This has worked well in previous versions of Xcode but I've upgraded to 8.3.3 and am having problems with the AVPlayerViewController.
The video plays and displays correctly. The control bar appears at the bottom of the view but does not respond to taps and, once faded out, does not reappear on tapping the video - unless, that is, I tap near the top left of the view.
My guess is that the actual controls are hidden in some kind of overlay which has the wrong size, i.e. it does not properly overlay the whole of the video view. Is there some way to force the AVPlayerViewController to lay out its view again?
I've tried adding:
_playerController!.view.setNeedsLayout()
_playerController!.view.layoutIfNeeded()
But this has no effect.
Here's my code:
override func viewDidLoad() {
    super.viewDidLoad()
    _player = AVPlayer()
    _playerController = AVPlayerViewController()
    _playerController!.showsPlaybackControls = true
    _playerController!.allowsPictureInPicturePlayback = false
    _playerController!.videoGravity = AVLayerVideoGravityResizeAspectFill
    _playerController!.player = _player
    self.addChildViewController(_playerController!)
    videoView.addSubview(_playerController!.view)
    ...
}

override func viewDidLayoutSubviews() {
    let frame = UIScreen.main.bounds
    _vidWidth = frame.width - 256.0
    _vidHeight = _vidWidth / 854.0 * 480.0
    videoView.frame = CGRect(x: 0, y: 10.0, width: _vidWidth, height: _vidHeight)
    videoView.backgroundColor = UIColor.black
    _playerController?.view.frame = CGRect(x: 0, y: 0, width: _vidWidth, height: _vidHeight)
    ...
}

func playVideo(_ clip: Clip) {
    var videoUrl: URL? = nil
    if clip.offlineLocation != nil && clip.state == 2 {
        _videoPath = clip.offlineLocation!
        videoUrl = URL(fileURLWithPath: _videoPath!)
    }
    else {
        _videoPath = "https://xxxx.cloudfront.net/" + clip.location
        videoUrl = URL(string: _videoPath!)
    }
    NSLog(_videoPath!)
    let asset = AVURLAsset(url: videoUrl!, options: nil)
    let playerItem = AVPlayerItem(asset: asset)
    _player!.replaceCurrentItem(with: playerItem)
    _player!.play()
}
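One thing worth checking, as an assumption on my side rather than something stated in the post: the view-controller containment sequence. After addChildViewController(_:) the child normally also gets a didMove(toParentViewController:) call, and the controller's view frame should be set before the controls are first tapped. A sketch of that embedding step, reusing the names from the question:

// Complete the containment contract when embedding the player controller
self.addChildViewController(_playerController!)
videoView.addSubview(_playerController!.view)
_playerController!.view.frame = videoView.bounds
_playerController!.didMove(toParentViewController: self)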
I want to show a video about a topic in the top half of the view and its text content in a text view in the bottom half. For the video I want controls such as play, pause, stop, fast-forward, etc. I also want to play it from a local resource, since my web services haven't been set up yet. Please suggest a good solution.
I have tried UIWebView and added constraints to the web view and the text view, but for some reason the web view is not showing the video correctly. Below is my code:
let purl = NSURL(fileURLWithPath: "/Users/Rohit/Desktop/videos/demo/demo/video1.mp4")
webView.loadHTMLString("<iframe width = \" \(webView.frame.width) \" height = \"\(webView.frame.height)\" src = \"\(purl)\"></iframe>", baseURL: nil)
webView.backgroundColor = UIColor.green
webView.mediaPlaybackRequiresUserAction = true
webView.scrollView.isScrollEnabled = true
webView.isUserInteractionEnabled = true
Import AVFoundation and AVKit.
Then play the video using a URL object (in Swift 3, NSURL is renamed to URL):
// videoURL: a URL to a local or remote video file
let player = AVPlayer(url: videoURL)
let controller = AVPlayerViewController()
controller.player = player

// Embed the player controller in the top half of the screen
self.addChildViewController(controller)
let screenSize = UIScreen.main.bounds.size
let videoFrame = CGRect(x: 0, y: 10, width: screenSize.width, height: (screenSize.height - 10) * 0.5)
controller.view.frame = videoFrame
self.view.addSubview(controller.view)
controller.didMove(toParentViewController: self)

player.play()
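Since the question mentions playing from a local resource, the same setup works with a file URL from the app bundle. A minimal sketch, assuming a file named video1.mp4 has been added to the app target (the file name is just an example):

// Look up the bundled video file (assumed to be added to the app target)
if let localURL = Bundle.main.url(forResource: "video1", withExtension: "mp4") {
    let player = AVPlayer(url: localURL)
    let controller = AVPlayerViewController()
    controller.player = player
    self.addChildViewController(controller)
    controller.view.frame = videoFrame   // same top-half frame as above
    self.view.addSubview(controller.view)
    controller.didMove(toParentViewController: self)
    player.play()
}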
You can use AVPlayerLayer and size it to the view's bounds.
private func inits() {
    // Create a layer for the AVPlayer and make it fill this view
    avPlayerLayer = AVPlayerLayer(player: player)
    avPlayerLayer.frame = self.bounds
    self.layer.insertSublayer(avPlayerLayer, at: 0)
}
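For context, here is a minimal sketch of a UIView subclass built around that idea; the PlayerView name and the player property are assumptions of mine, not part of the original answer:

import AVFoundation
import UIKit

// A plain UIView that hosts an AVPlayerLayer sized to its bounds
class PlayerView: UIView {
    let player = AVPlayer()
    private var avPlayerLayer: AVPlayerLayer!

    override init(frame: CGRect) {
        super.init(frame: frame)
        inits()
    }

    required init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
        inits()
    }

    private func inits() {
        avPlayerLayer = AVPlayerLayer(player: player)
        avPlayerLayer.frame = self.bounds
        self.layer.insertSublayer(avPlayerLayer, at: 0)
    }

    // Keep the layer in sync when the view is resized
    override func layoutSubviews() {
        super.layoutSubviews()
        avPlayerLayer.frame = self.bounds
    }
}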
I'm trying to show a video in augmented reality using Vuforia - but for the sake of this question, just showing the scene and video would be fine.
What's expected:
Show the video (playing) at the correct speed for video and audio and have them both in sync.
What's happening:
Audio plays at correct speed. Video plays at a seriously fast speed - like 10x.
Tried:
I've tried changing the rate - it's ignored completely.
I've tried using different ways (AVPlayer, AVPlayerLayer, SKVideoNode(withURL)) of putting the video into the scene - all suffer from hyperactive-video-syndrome
I've tried other file formats - nope
I've tried local files and URL - no dice
I've tried throwing my laptop at a wall - it made the video go away
Code to return the scene with the video:
private func createVideoScene(with view: VuforiaEAGLView) -> SCNScene {
    // create the asset & player and grab the dimensions
    let asset = AVAsset(URL: NSURL(string: "https://inm-baobab-prod-eu-west-1.s3.amazonaws.com/public/inm/media/video/2016/09/02/61537094SansSouciGirlsSchool.mp4")!)
    let size = asset.tracksWithMediaType(AVMediaTypeVideo)[0].naturalSize
    let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))

    let videoNode = SKVideoNode(AVPlayer: player)
    videoNode.size = size
    videoNode.position = CGPoint(x: size.width * 0.5, y: size.height * 0.5)

    let videoScene = SKScene(size: size)
    videoScene.addChild(videoNode)

    let videoWrapperNode = SCNNode(geometry: SCNPlane(width: 10, height: 8))
    videoWrapperNode.position = SCNVector3(x: 0, y: 0, z: 0)
    videoWrapperNode.geometry?.firstMaterial?.diffuse.contents = videoScene
    videoWrapperNode.geometry?.firstMaterial?.doubleSided = true
    videoWrapperNode.scale.y = -1
    videoWrapperNode.name = "video"

    let scene = SCNScene()
    scene.rootNode.addChildNode(videoWrapperNode)
    return scene
}
Thank you
PS. Help in Objective-C is also welcome :)
I created a video player using AVPlayer and AVPlayerViewController. I have set the allowsExternalPlayback property to true and also usesExternalPlaybackWhileExternalScreenIsActive to true, but I am still not getting the AirPlay icon in the player controls.
player = AVPlayer(URL: url!)
player!.allowsExternalPlayback = true
player?.usesExternalPlaybackWhileExternalScreenIsActive = true
I am running my app on iOS 9.2.
You need to add an MPVolumeView in order to get this. You can read about this here: https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/AirPlayGuide/EnrichYourAppforAirPlay/EnrichYourAppforAirPlay.html
// Requires: import MediaPlayer
func appleTv() {
    // Create an off-screen MPVolumeView; its subviews include the AirPlay route button
    let rect = CGRect(x: -100, y: 0, width: 0, height: 0)
    let airplayVolume = MPVolumeView(frame: rect)
    airplayVolume.showsVolumeSlider = false
    self.view.addSubview(airplayVolume)

    // Programmatically tap the route button to present the AirPlay picker
    for view: UIView in airplayVolume.subviews {
        if let button = view as? UIButton {
            button.sendActions(for: .touchUpInside)
            break
        }
    }
    airplayVolume.removeFromSuperview()
}
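If you would rather show a visible AirPlay button in your own controls than trigger the picker programmatically, an MPVolumeView can also be added as a small persistent view. A minimal sketch; the 44x44 frame and the controlsBar container are assumptions of mine:

// Show only the AirPlay route button, not the volume slider (requires import MediaPlayer)
let routeButton = MPVolumeView(frame: CGRect(x: 0, y: 0, width: 44, height: 44))
routeButton.showsVolumeSlider = false
routeButton.showsRouteButton = true
controlsBar.addSubview(routeButton)   // controlsBar: your own controls container (assumed)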
I'm trying to play a video in a scroll view. For that I ended up using an AVPlayerViewController, because it worked really well for the spacing between multiple videos. The problem, however, is that all the videos are approx. 1/3 bigger than the size of the screen. What did I do wrong?
let player = AVPlayer(URL: NSURL(string: videoLink))
let playerController = AVPlayerViewController()
playerController.player = player
playerController.showsPlaybackControls = false
playerController.videoGravity = AVLayerVideoGravityResizeAspect
playerController.view.frame = CGRectMake(0, (UIScreen.mainScreen().bounds.height) * CGFloat(index), UIScreen.mainScreen().bounds.width, 0)
UPDATE:
The following code still shows the video. The frame is really small, but it still shows. I tried debugging it by printing out its "view.frame.bounds.width", "view.frame.bounds.size.width", "view.frame.size.width", and "view.frame.width", and they all said "0".
playerController.view.frame = CGRectMake(0, (self.view.frame.size.height - 64) * CGFloat(index) + 30, 0, 0)
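I can't tell from the snippet alone what causes the 1/3 overshoot, but for comparison, here is a general sketch of the layout I would expect for one screen-sized page per video inside a scroll view; scrollView, videoLinks, and the paging setup are assumptions of mine, not from the question:

// One screen-sized page per video, stacked vertically inside the scroll view
let pageSize = UIScreen.main.bounds.size
for (index, videoLink) in videoLinks.enumerated() {
    guard let url = URL(string: videoLink) else { continue }
    let playerController = AVPlayerViewController()
    playerController.player = AVPlayer(url: url)
    playerController.showsPlaybackControls = false
    playerController.videoGravity = AVLayerVideoGravityResizeAspect

    // Give each page an explicit, non-zero width and height
    playerController.view.frame = CGRect(x: 0,
                                         y: pageSize.height * CGFloat(index),
                                         width: pageSize.width,
                                         height: pageSize.height)
    addChildViewController(playerController)
    scrollView.addSubview(playerController.view)
    playerController.didMove(toParentViewController: self)
}
scrollView.contentSize = CGSize(width: pageSize.width, height: pageSize.height * CGFloat(videoLinks.count))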