Issues with vertical video orientation in SKVideoNode using Swift - ios

The following code shows my video file in correct zPosition with the other elements I'm working with, creating a background video.
The problem I'm having is that the vertical video (1080x1920 pixels) gets rotated 90 degrees counterclockwise, and is stretched to fit as a landscape video. How can I ensure correct orientation without sacrificing my need to use the SKVideoNode with zPosition?
let videoNode: SKVideoNode? = {
    guard let urlString = Bundle.main.path(forResource: "merry", ofType: "mov") else {
        return nil
    }
    let url = URL(fileURLWithPath: urlString)
    let item = AVPlayerItem(url: url)
    player = AVPlayer(playerItem: item)
    return SKVideoNode(avPlayer: player)
}()
videoNode?.position = CGPoint(x: frame.midX, y: frame.midY)
videoNode?.size = self.frame.size
videoNode?.zPosition = 20
addChild(videoNode!)
player.play()
player.volume = 0
Many thanks!

Got there in the end with a workaround:
// fix: rotate the vertical video by -90 degrees and swap width/height to fit
videoNode?.zRotation = -CGFloat.pi / 2
videoNode?.size.width = self.frame.size.height
videoNode?.size.height = self.frame.size.width
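For reference, the rotation/size swap in the workaround can be expressed as a small pure helper. Rotating a node by -π/2 swaps its visual axes, so the target frame's width and height must be exchanged. The function name is illustrative, not part of SpriteKit:

```swift
import Foundation

/// Returns the zRotation (radians) and node size needed for a portrait video
/// to fill a given frame after being rotated 90° clockwise.
/// (Illustrative helper, not part of SpriteKit.)
func portraitFillFix(for frame: CGSize) -> (rotation: CGFloat, size: CGSize) {
    // A -π/2 rotation swaps the node's visual axes, so the frame's
    // width and height are exchanged for the node's size.
    return (rotation: -CGFloat.pi / 2,
            size: CGSize(width: frame.height, height: frame.width))
}
```

With this, videoNode?.zRotation and videoNode?.size could be set from the returned tuple instead of hard-coding the swap.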

Related

AVPlayerViewController controls not working

I have an app which plays video clips on demand. This has worked well in previous versions of Xcode but I've upgraded to 8.3.3 and am having problems with the AVPlayerViewController.
The video plays and displays correctly. The control bar appears at the bottom of the view but does not respond to taps and, once faded out, does not reappear on tapping the video - unless, that is, I tap near the top left of the view.
My guess is that the actual controls are hidden in some kind of overlay which has the wrong size i.e. it does not properly overlay the whole of the video view. Is there some way to force the AVPlayerViewController to relayout?
I've tried adding:
_playerController!.view.setNeedsLayout()
_playerController!.view.layoutIfNeeded()
But this has no effect.
Here's my code:
override func viewDidLoad() {
    super.viewDidLoad()
    _player = AVPlayer()
    _playerController = AVPlayerViewController()
    _playerController!.showsPlaybackControls = true
    _playerController!.allowsPictureInPicturePlayback = false
    _playerController!.videoGravity = AVLayerVideoGravityResizeAspectFill
    _playerController!.player = _player
    self.addChildViewController(_playerController!)
    videoView.addSubview(_playerController!.view)
    ...

override func viewDidLayoutSubviews() {
    let frame = UIScreen.main.bounds
    _vidWidth = frame.width - 256.0
    _vidHeight = _vidWidth / 854.0 * 480.0
    videoView.frame = CGRect(x: 0, y: 10.0, width: _vidWidth, height: _vidHeight)
    videoView.backgroundColor = UIColor.black
    _playerController?.view.frame = CGRect(x: 0, y: 0, width: _vidWidth, height: _vidHeight)
    ...

func playVideo(_ clip: Clip) {
    var videoUrl: URL? = nil
    if clip.offlineLocation != nil && clip.state == 2 {
        _videoPath = clip.offlineLocation!
        videoUrl = URL(fileURLWithPath: _videoPath!)
    } else {
        _videoPath = "https://xxxx.cloudfront.net/" + clip.location
        videoUrl = URL(string: _videoPath!)
    }
    NSLog(_videoPath!)
    let asset = AVURLAsset(url: videoUrl!, options: nil)
    let playerItem = AVPlayerItem(asset: asset)
    _player!.replaceCurrentItem(with: playerItem)
    _player!.play()
}
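One thing worth checking (an assumption, the thread does not confirm it): the containment sequence above calls addChildViewController(_:) but never didMove(toParentViewController:), and skipping that call can leave a child view controller in an inconsistent state. Separately, the layout math in viewDidLayoutSubviews can be factored into a small pure helper; the name here is illustrative, not from the original code:

```swift
import Foundation

/// Computes the video view size from the screen width, reserving 256 pt
/// of side space and keeping the 854:480 aspect ratio used in the question.
/// (Illustrative helper, not from the original code.)
func videoViewSize(screenWidth: CGFloat) -> CGSize {
    let width = screenWidth - 256.0
    let height = width / 854.0 * 480.0
    return CGSize(width: width, height: height)
}
```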

how to show video in a small frame in ios

I want to show a video about a topic in the top half of the view, and its accompanying text in a text view in the bottom half. For the video control I want features like play, pause, stop, fast-forward, etc. I also want to play it from a local resource, as my web services haven't been set up yet. Please suggest a good solution.
I have tried UIWebView and added constraints to the web view and text view, but for some reason the web view is not showing the video correctly. Below is my code:
let purl = NSURL(fileURLWithPath: "/Users/Rohit/Desktop/videos/demo/demo/video1.mp4")
webView.loadHTMLString("<iframe width=\"\(webView.frame.width)\" height=\"\(webView.frame.height)\" src=\"\(purl)\"></iframe>", baseURL: nil)
webView.backgroundColor = UIColor.green
webView.mediaPlaybackRequiresUserAction = true
webView.scrollView.isScrollEnabled = true
webView.isUserInteractionEnabled = true
Import AVFoundation and AVKit
Then play the video using a URL object (in Swift 3, NSURL is renamed to URL):
let player = AVPlayer(url: videoURL)
let controller = AVPlayerViewController()
controller.player = player
self.addChildViewController(controller)
let screenSize = UIScreen.main.bounds.size
let videoFrame = CGRect(x: 0, y: 10, width: screenSize.width, height: (screenSize.height - 10) * 0.5)
controller.view.frame = videoFrame
self.view.addSubview(controller.view)
player.play()
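The frame arithmetic above (video in the top half, offset 10 pt from the top) can be isolated in a small pure function; the helper name is illustrative, not from the answer's code:

```swift
import Foundation

/// Frame for a video view occupying the top half of the screen,
/// offset 10 pt from the top. (Illustrative helper, not from the answer.)
func topHalfVideoFrame(screen: CGSize) -> CGRect {
    return CGRect(x: 0, y: 10,
                  width: screen.width,
                  height: (screen.height - 10) * 0.5)
}
```

controller.view.frame = topHalfVideoFrame(screen: UIScreen.main.bounds.size) would then replace the inline arithmetic.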
You can use AVPlayerLayer and give it bounds.
private func inits() {
    //let rootLayer: CALayer = self.layer
    //rootLayer.masksToBounds = true
    avPlayerLayer = AVPlayerLayer(player: player)
    avPlayerLayer.bounds = self.bounds
    //avPlayerLayer.backgroundColor = UIColor.yellowColor().CGColor
    self.layer.insertSublayer(avPlayerLayer, atIndex: 0)
}

SCNVideoNode playing extremely fast

I'm trying to show a video in augmented reality using Vuforia - but for the sake of this question, just showing the scene and video would be fine.
What's expected:
Show the video (playing) at the correct speed for video and audio and have them both in sync.
What's happening:
Audio plays at correct speed. Video plays at a seriously fast speed - like 10x.
Tried:
I've tried changing the rate - it's ignored completely.
I've tried using different ways (AVPlayer, AVPlayerLayer, SKVideoNode(withURL)) of putting the video into the scene - all suffer from hyperactive-video-syndrome
I've tried other file formats - nope
I've tried local files and URL - no dice
I've tried throwing my laptop at a wall - it made the video go away
Code to return the scene with the video:
private func createVideoScene(with view: VuforiaEAGLView) -> SCNScene {
    // create the asset & player and grab the dimensions
    let asset = AVAsset(URL: NSURL(string: "https://inm-baobab-prod-eu-west-1.s3.amazonaws.com/public/inm/media/video/2016/09/02/61537094SansSouciGirlsSchool.mp4")!)
    let size = asset.tracksWithMediaType(AVMediaTypeVideo)[0].naturalSize
    let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))
    let videoNode = SKVideoNode(AVPlayer: player)
    videoNode.size = size
    videoNode.position = CGPoint(x: size.width * 0.5, y: size.height * 0.5)
    let videoScene = SKScene(size: size)
    videoScene.addChild(videoNode)
    let videoWrapperNode = SCNNode(geometry: SCNPlane(width: 10, height: 8))
    videoWrapperNode.position = SCNVector3(x: 0, y: 0, z: 0)
    videoWrapperNode.geometry?.firstMaterial?.diffuse.contents = videoScene
    videoWrapperNode.geometry?.firstMaterial?.doubleSided = true
    videoWrapperNode.scale.y = -1
    videoWrapperNode.name = "video"
    let scene = SCNScene()
    scene.rootNode.addChildNode(videoWrapperNode)
    return scene
}
Thank you
PS. Help in Objective-C is also welcome :)
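As an aside on the question's code (unrelated to the playback-speed problem): the fixed 10×8 plane will distort any video whose aspect ratio isn't 5:4. The plane height can instead be derived from the track's natural size; this helper is illustrative, not part of the question's code:

```swift
import Foundation

/// Given a video's natural size and a desired plane width (SceneKit units),
/// returns the plane height that preserves the video's aspect ratio.
/// (Illustrative helper; not part of the question's code.)
func planeHeight(for videoSize: CGSize, planeWidth: CGFloat) -> CGFloat {
    return planeWidth * videoSize.height / videoSize.width
}
```

SCNPlane(width: 10, height: planeHeight(for: size, planeWidth: 10)) would then fit the sample 16:9 video without stretching.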

Swift video on sphere, black screen but can hear audio

I'm trying to make a 360-degree video player by projecting a video onto a sphere, turning the normals inside out, and placing the camera inside the sphere. However, I'm getting a black screen when I try to play the video on the sphere, even though I can hear the audio play.
override func viewDidLoad() {
    super.viewDidLoad()
    //create sphere
    makeSphere();
    //create user-interface
    resizeScreen();
}
In viewDidLoad (above) I call resizeScreen(), which makes the video player adapt to the screen.
//set variables
var player : AVPlayer? = nil;
var playerLayer : AVPlayerLayer? = nil;
var asset : AVAsset? = nil;
var playerItem : AVPlayerItem? = nil;
var timer = NSTimer();
//start functions
func resizeScreen() {
    //Get device screen width & height
    var screenSize: CGRect = UIScreen.mainScreen().bounds;
    //set screen width and height in variables
    var screenWidth = screenSize.width;
    var screenHeight = screenSize.height;
    //change size of video player
    playerLayer?.frame.size.height = screenHeight * 0.75;
    playerLayer?.frame.size.width = screenWidth * 1.0;
    //change size of button
    PlayVid.frame.size.height = screenHeight * 0.75;
    PlayVid.frame.size.width = screenWidth * 1;
    //set position of text label
    TemporaryURL.center.x = screenWidth * 0.5;
    TemporaryURL.center.y = screenHeight * 0.9;
}
resizeScreen(), as mentioned above, makes the video player and button adapt to the screen size.
func makeSphere() {
    let sceneView = SCNView(frame: self.view.frame);
    self.view.addSubview(sceneView);
    //Get device screen width & height
    var screenSize: CGRect = UIScreen.mainScreen().bounds;
    //set screen width and height in variables
    var screenWidth = screenSize.width;
    var screenHeight = screenSize.height;
    //set scene's height, width and x-position
    sceneView.frame.size.height = screenHeight * 0.75;
    sceneView.frame.size.width = screenWidth * 1;
    sceneView.center.x = screenWidth * 0.5;
    //create scene
    let scene = SCNScene();
    sceneView.scene = scene;
    //camera positioning
    let camera = SCNCamera();
    let cameraNode = SCNNode();
    cameraNode.camera = camera;
    cameraNode.position = SCNVector3(x: -3.0, y: 3.0, z: 3.0);
    //lighting
    let light = SCNLight();
    light.type = SCNLightTypeOmni;
    let lightNode = SCNNode();
    lightNode.light = light;
    lightNode.position = SCNVector3(x: 0, y: 0, z: 0);
    //create a sphere
    let sphereGeometry = SCNSphere(radius: 2.5);
    //create sphere node
    let sphereNode = SCNNode(geometry: sphereGeometry);
    //create constraint for sphere
    let constraint = SCNLookAtConstraint(target: sphereNode);
    //constraint.gimbalLockEnabled = true;
    cameraNode.constraints = [constraint];
    //set nodes
    //scene.rootNode.addChildNode(lightNode);
    scene.rootNode.addChildNode(cameraNode);
    scene.rootNode.addChildNode(sphereNode);
    inlineVideo();
    player!.play();
    //make variable
    let videoMaterial = SCNMaterial();
    var imgURL = NSURL(string: "http://imgurl");
    var videoURLWithPath = "http://videourl";
    var videoURL = NSURL(string: videoURLWithPath);
    //apply texture to variable
    videoMaterial.diffuse.contents = AVPlayerLayer(player: self.player);
    videoMaterial.specular.contents = UIColor.redColor();
    videoMaterial.shininess = 1.0;
    //set texture on object of name sphereGeometry
    sphereGeometry.materials = [videoMaterial];
}
makeSphere() creates the sphere and initialises the video for it.
func inlineVideo() {
    //play inline video
    //insert url of video into textfield
    var PasteBoard = UIPasteboard.generalPasteboard().string;
    TemporaryURL.text = PasteBoard;
    //get path of video
    //var videoURLWithPath = PasteBoard;
    var videoURLWithPath = "http://videourl";
    var videoURL = NSURL(string: videoURLWithPath);
    if (videoURL == nil) {
        videoURL = NSURL(string: "http://videourl");
    } else {
        //do nothing
    }
    //get video url & set it
    asset = AVAsset(URL: videoURL!) as AVAsset;
    playerItem = AVPlayerItem(asset: asset!);
    //set target for video
    player = AVPlayer(playerItem: self.playerItem!);
    playerLayer = AVPlayerLayer(player: self.player);
}
This is the video player code I use to initialise the regular video. I tried to bind playerLayer to the texture of the sphere, but all I'm getting is a black screen even though I'm hearing sound. Why isn't the video projecting onto the sphere? I think something like this should work.
Check the following links:
https://github.com/nomtek/spherical_video_player_ios
https://github.com/Aralekk/simple360player_iOS
https://github.com/hanton/HTY360Player
There are 3 different approaches (e.g. OpenGL/SceneKit, shader/fixed-function pipeline, Swift/Obj-C, RGB/YCbCr), so everyone should find a handy piece of code.
You need to use SKVideoNode to display a video on a sphere, see SKVideoNode only on a small part of SCNSphere - still minor issues, will update when it's fixed.
Set up your player and start playing before you assign it as material to make it work.
var player = AVPlayer(url: yourURL)
player.play()
node.geometry?.firstMaterial?.emissive.contents = player

How to play 360 video on the iOS device

Looking through different websites and analyzing different resources, I found out that to play 360 videos on the iPhone you should use a third-party library (Panorama). But I'm really interested in how to do it on my own, because the standard iOS elements do not support such functionality.
Please give some advice about the approaches I should use to create my own player for 360 videos.
You can do it using SceneKit, without any external libraries or dependencies.
Create an SCNScene and map your camera to the device movements. The two cameras are slightly offset, one per eye, to create a 3D stereoscopic effect.
override func viewDidLoad() {
    super.viewDidLoad()
    leftSceneView?.backgroundColor = UIColor.blackColor()
    rightSceneView?.backgroundColor = UIColor.whiteColor()
    // Create Scene
    scene = SCNScene()
    leftSceneView?.scene = scene
    rightSceneView?.scene = scene
    // Create cameras
    let camX = 0.0 as Float
    let camY = 0.0 as Float
    let camZ = 0.0 as Float
    let zFar = 50.0
    let leftCamera = SCNCamera()
    let rightCamera = SCNCamera()
    leftCamera.zFar = zFar
    rightCamera.zFar = zFar
    let leftCameraNode = SCNNode()
    leftCameraNode.camera = leftCamera
    leftCameraNode.position = SCNVector3(x: camX - 0.5, y: camY, z: camZ)
    let rightCameraNode = SCNNode()
    rightCameraNode.camera = rightCamera
    rightCameraNode.position = SCNVector3(x: camX + 0.5, y: camY, z: camZ)
    camerasNode = SCNNode()
    camerasNode!.position = SCNVector3(x: camX, y: camY, z: camZ)
    camerasNode!.addChildNode(leftCameraNode)
    camerasNode!.addChildNode(rightCameraNode)
    camerasNode!.eulerAngles = SCNVector3Make(degreesToRadians(-90.0), 0, 0)
    cameraRollNode = SCNNode()
    cameraRollNode!.addChildNode(camerasNode!)
    cameraPitchNode = SCNNode()
    cameraPitchNode!.addChildNode(cameraRollNode!)
    cameraYawNode = SCNNode()
    cameraYawNode!.addChildNode(cameraPitchNode!)
    scene!.rootNode.addChildNode(cameraYawNode!)
    leftSceneView?.pointOfView = leftCameraNode
    rightSceneView?.pointOfView = rightCameraNode
    // Respond to user head movement. Refreshes the position of the camera 60 times per second.
    motionManager = CMMotionManager()
    motionManager?.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager?.startDeviceMotionUpdatesUsingReferenceFrame(CMAttitudeReferenceFrame.XArbitraryZVertical)
    leftSceneView?.delegate = self
    leftSceneView?.playing = true
    rightSceneView?.playing = true
}
Update the camera position in the sceneRenderer:
func renderer(aRenderer: SCNSceneRenderer, updateAtTime time: NSTimeInterval) {
    // Render the scene
    dispatch_async(dispatch_get_main_queue()) { () -> Void in
        if let mm = self.motionManager, let motion = mm.deviceMotion {
            let currentAttitude = motion.attitude
            var orientationMultiplier = 1.0
            if UIApplication.sharedApplication().statusBarOrientation == UIInterfaceOrientation.LandscapeRight {
                orientationMultiplier = -1.0
            }
            self.cameraRollNode!.eulerAngles.x = Float(currentAttitude.roll * orientationMultiplier)
            self.cameraPitchNode!.eulerAngles.z = Float(currentAttitude.pitch)
            self.cameraYawNode!.eulerAngles.y = Float(currentAttitude.yaw)
        }
    }
}
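The camera-rig setup above relies on a degreesToRadians helper that isn't shown in the answer; assuming it is the usual conversion, a minimal version looks like this:

```swift
import Foundation

/// Degrees-to-radians conversion used by the camera rig setup above.
/// (Assumed implementation; the original helper isn't shown.)
func degreesToRadians(_ degrees: Double) -> Float {
    return Float(degrees * .pi / 180.0)
}
```

It returns Float so the result can be passed straight into SCNVector3Make, as in camerasNode!.eulerAngles = SCNVector3Make(degreesToRadians(-90.0), 0, 0).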
Here is some code to add a SCNSphere displaying an AVPlayer.
func play() {
    //let fileURL: NSURL? = NSURL(string: "http://www.kolor.com/360-videos-files/noa-neal-graffiti-360-music-video-full-hd.mp4")
    let fileURL: NSURL? = NSURL.fileURLWithPath(NSBundle.mainBundle().pathForResource("vr", ofType: "mp4")!)
    if fileURL != nil {
        videoSpriteKitNode = SKVideoNode(AVPlayer: AVPlayer(URL: fileURL!))
        videoNode = SCNNode()
        videoNode!.geometry = SCNSphere(radius: 30)
        let spriteKitScene = SKScene(size: CGSize(width: 2500, height: 2500))
        spriteKitScene.scaleMode = .AspectFit
        videoSpriteKitNode!.position = CGPoint(x: spriteKitScene.size.width / 2.0, y: spriteKitScene.size.height / 2.0)
        videoSpriteKitNode!.size = spriteKitScene.size
        spriteKitScene.addChild(videoSpriteKitNode!)
        videoNode!.geometry?.firstMaterial?.diffuse.contents = spriteKitScene
        videoNode!.geometry?.firstMaterial?.doubleSided = true
        // Flip video upside down, so that it's shown in the right position
        var transform = SCNMatrix4MakeRotation(Float(M_PI), 0.0, 0.0, 1.0)
        transform = SCNMatrix4Translate(transform, 1.0, 1.0, 0.0)
        videoNode!.pivot = SCNMatrix4MakeRotation(Float(M_PI_2), 0.0, -1.0, 0.0)
        videoNode!.geometry?.firstMaterial?.diffuse.contentsTransform = transform
        videoNode!.position = SCNVector3(x: 0, y: 0, z: 0)
        scene!.rootNode.addChildNode(videoNode!)
        videoSpriteKitNode!.play()
        playingVideo = true
    }
}
I've put together a project on GitHub to show how, with instructions that should be clear!
It works in VR too with a Google Cardboard.
https://github.com/Aralekk/simple360player_iOS
Apart from Aralekk's repo, I've found hanton's repo useful when creating my own 360° video player.
Here is the link to my repo, and here is a related blog post.
You can use the NYT360Video framework (pod 'NYT360Video') for a 360-degree player, in both Objective-C and Swift. A default example is provided for Objective-C; for Swift, use the same framework as follows:
/// For 360 degree view
// just import the framework
import NYT360Video

// declare the nyt360VC globally for the view controller
var nyt360VC: NYT360ViewController!

/// Player implementation
let videoURL = Bundle.main.url(forResource: "360Video", withExtension: "mp4")!
let player = AVPlayer(url: videoURL)
let playerLayer = AVPlayerLayer(player: player)
playerLayer.frame = self.view.bounds
self.view.layer.addSublayer(playerLayer)
player.play()

/// 360 Degree VC implementation along with the motion manager
let manager: NYT360MotionManagement = NYT360MotionManager.shared()
self.nyt360VC = NYT360ViewController.init(avPlayer: player, motionManager: manager)

// Adding the 360 degree view controller to the view
self.addChildViewController(nyt360VC)
self.view.addSubview(self.nyt360VC.view)
self.nyt360VC.didMove(toParentViewController: self)
I implemented a similar solution using SceneKit for iOS here: ThreeSixtyPlayer.
In my opinion this code is a bit simpler and more scalable than other solutions out there. It is however just the basics (no stereoscopic playback, only supports sphere geometry, doesn't yet support cardboard, etc.).
