I'm trying to show a video in augmented reality using Vuforia - but for the sake of this question, just showing the scene and video would be fine.
What's expected:
Show the video (playing) at the correct speed for video and audio and have them both in sync.
What's happening:
Audio plays at correct speed. Video plays at a seriously fast speed - like 10x.
Tried:
I've tried changing the rate - it's ignored completely.
I've tried different ways of putting the video into the scene (AVPlayer, AVPlayerLayer, SKVideoNode(withURL)) - all suffer from hyperactive-video-syndrome (a sketch of the AVPlayer-as-material variant follows this list).
I've tried other file formats - nope.
I've tried local files and URLs - no dice.
I've tried throwing my laptop at a wall - it made the video go away.
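For reference, here is a minimal sketch (in current Swift syntax) of the AVPlayer-as-material variant mentioned above: SceneKit accepts an AVPlayer directly as a material's diffuse contents, bypassing SpriteKit entirely. The function name and videoURL parameter are placeholders, not code from the question.

import SceneKit
import AVFoundation

func makePlayerPlane(videoURL: URL) -> SCNNode {
    let player = AVPlayer(url: videoURL)
    let plane = SCNPlane(width: 10, height: 8)
    plane.firstMaterial?.diffuse.contents = player // AVPlayer is valid material contents
    plane.firstMaterial?.isDoubleSided = true
    player.play()
    return SCNNode(geometry: plane)
}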
Code to return the scene with the video:
private func createVideoScene(with view: VuforiaEAGLView) -> SCNScene {
    // create the asset & player and grab the dimensions
    let asset = AVAsset(URL: NSURL(string: "https://inm-baobab-prod-eu-west-1.s3.amazonaws.com/public/inm/media/video/2016/09/02/61537094SansSouciGirlsSchool.mp4")!)
    let size = asset.tracksWithMediaType(AVMediaTypeVideo)[0].naturalSize
    let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))

    // wrap the player in a SpriteKit scene sized to the video
    let videoNode = SKVideoNode(AVPlayer: player)
    videoNode.size = size
    videoNode.position = CGPoint(x: size.width * 0.5, y: size.height * 0.5)
    let videoScene = SKScene(size: size)
    videoScene.addChild(videoNode)

    // use the SpriteKit scene as the texture of a SceneKit plane
    let videoWrapperNode = SCNNode(geometry: SCNPlane(width: 10, height: 8))
    videoWrapperNode.position = SCNVector3(x: 0, y: 0, z: 0)
    videoWrapperNode.geometry?.firstMaterial?.diffuse.contents = videoScene
    videoWrapperNode.geometry?.firstMaterial?.doubleSided = true
    videoWrapperNode.scale.y = -1 // flip vertically; SpriteKit's y axis is inverted relative to the texture space
    videoWrapperNode.name = "video"

    let scene = SCNScene()
    scene.rootNode.addChildNode(videoWrapperNode)
    return scene
}
Thank you
PS. Help in Objective-C is also welcome :)
Related
I want to play a YouTube video in my image tracking module: when an image is tracked, play the YouTube video on it.
Can anyone help me?
private func makeVideo(with url: URL, size: CGSize) -> SCNNode? {
    // configure and load the YTSwiftyPlayer (declared below) on the main queue
    DispatchQueue.main.async {
        self.player.frame = CGRect(x: 0, y: 0, width: 650, height: 400)
        self.player.autoplay = true
        self.player.loadPlayer()
    }
    // use the player view as the material contents
    let avMaterial = SCNMaterial()
    avMaterial.diffuse.contents = player
    // a plane to carry the material
    let videoPlane = SCNPlane(width: size.width, height: size.height)
    videoPlane.materials = [avMaterial]
    // lay the plane flat over the tracked image
    let videoNode = SCNNode(geometry: videoPlane)
    videoNode.eulerAngles.x = -.pi / 2
    return videoNode
}
import YoutubeKit
let player = YTSwiftyPlayer(playerVars: [.videoID("GJQsT-h0FTU")])
I tried this, but it shows a black screen on the image while the audio plays.
This may not be exactly the solution to your issue, but I hope it can help you in some way.
What you can do in addition to the image detection is display a UIWebView:
'UIWebView' was deprecated in iOS 12.0: No longer supported; please adopt WKWebView.
Even though Xcode will warn you with the message above, WKWebView does not display anything when used as a SceneKit material, so you will need to stick with UIWebView until that issue is fixed.
So, assuming that you have figured out the image detection part and are able to locate and put nodes over the detected image, you can implement the following function:
func displayWebSite(on rootNode: SCNNode, horizontalOffset: CGFloat) {
    DispatchQueue.main.async {
        // Open YouTube
        let request = URLRequest(url: URL(string: "https://youtu.be/7ehEPsrw1X8")!)
        // Define the size
        let webView = UIWebView(frame: CGRect(x: 0, y: 0, width: 650, height: 900))
        webView.loadRequest(request)
        // Set size
        let webViewPlane = SCNPlane(width: horizontalOffset, height: horizontalOffset * 1.45)
        webViewPlane.cornerRadius = 0.025
        // Define geometry
        let webViewNode = SCNNode(geometry: webViewPlane)
        // Set the WebView as a material of the plane
        webViewNode.geometry?.firstMaterial?.diffuse.contents = webView
        webViewNode.opacity = 0
        // Put it a little in front to avoid merging with the detected image
        webViewNode.position.z += 0.04
        // Add the node
        rootNode.addChildNode(webViewNode)
    }
}
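A hypothetical call site, assuming ARKit image detection via an ARSCNViewDelegate (the delegate method is ARKit's; passing the detected image's physical width as the offset is my assumption):

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let imageAnchor = anchor as? ARImageAnchor else { return }
    // node is the node ARKit creates for the detected reference image;
    // size the web view plane relative to the image's physical width
    displayWebSite(on: node, horizontalOffset: imageAnchor.referenceImage.physicalSize.width)
}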
For reference, you can check one of my projects to see how that function works in practice.
So I have the following code to create a custom wall:
let wall = SCNPlane(width: CGFloat(distance),
                    height: CGFloat(height))
wall.firstMaterial = wallMaterial()
let node = SCNNode(geometry: wall)
// always render before the beachballs
node.renderingOrder = -10
// get center point
node.position = SCNVector3(from.x + (to.x - from.x) * 0.5,
                           from.y + height * 0.5,
                           from.z + (to.z - from.z) * 0.5)
node.eulerAngles = SCNVector3(0,
                              -atan2(to.x - node.position.x, from.z - node.position.z) - Float.pi * 0.5,
                              0)
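The snippet assumes a wallMaterial() helper that isn't shown; a minimal hypothetical version returning a translucent solid color could look like this:

func wallMaterial() -> SCNMaterial {
    let material = SCNMaterial()
    material.diffuse.contents = UIColor.white.withAlphaComponent(0.5) // placeholder look
    material.isDoubleSided = true
    return material
}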
And now I am adding a simple SCNPlane at the hit-test location and adding the video (an SKScene) to it:
// first.node is the hit-test result
let node = SCNNode(geometry: SCNPlane(width: CGFloat(width), height: CGFloat(height)))
node.geometry?.firstMaterial?.isDoubleSided = true
node.geometry?.firstMaterial?.diffuse.contents = self.create2DVideoScene(xScale: first.node.eulerAngles.y < 0 ? -1 : nil)
node.position = nodesWithDistance.previous.node.mainNode.position
node.eulerAngles = first.node.eulerAngles
Here is how I created the 2D video scene:
/// Creates 2D video scene
private func create2DVideoScene(xScale: CGFloat?) -> SKScene {
    var videoPlayer = AVPlayer()
    if let validURL = Bundle.main.url(forResource: "video", withExtension: "mp4", subdirectory: "/art.scnassets") {
        let item = AVPlayerItem(url: validURL)
        videoPlayer = AVPlayer(playerItem: item)
    }
    let videoNode = SKVideoNode(avPlayer: videoPlayer)
    videoNode.yScale *= -1 // flip vertically to compensate for SpriteKit's inverted y axis
    // While debugging I observed that if first.node.rotation.y is negative, we need to set
    // xScale to -1 (when the wall is drawn from right to left)
    if let xScale = xScale {
        videoNode.xScale *= xScale
    }
    videoNode.play()
    let skScene = SKScene(size: self.sceneView.frame.size)
    skScene.scaleMode = .aspectFill
    skScene.backgroundColor = .green
    skScene.addChild(videoNode)
    videoNode.position = CGPoint(x: skScene.size.width / 2, y: skScene.size.height / 2)
    videoNode.size = skScene.size
    return skScene
}
Issue: If I draw the wall node from left to right (the first point on the left, the second on the right) and draw the wall between them, the video is flipped.
If I draw from right to left (the first point on the right, the second on the left) and draw the line between them, the video is perfectly fine.
To fix this I checked the wall's eulerAngles (see the xScale logic passed into self.create2DVideoScene), but this does not work in every area of the real world.
I want the video not to start flipped in front of the user.
EDIT
The video is flipped because the eulerAngles are different in each case when creating the wall:
Angle point1 to point2 --> (0.000000 -1.000000 0.000000 3.735537)
Angle point2 to point1 --> (0.000000 -1.000000 0.000000 0.478615)
Issue video: Click here to play video (it shows the flipped result).
Please provide a suggestion or a solution for this issue.
I have fixed the issue with a temporary solution, but I'm still looking for a better one.
The issue is with the wall: the wall is flipped to the other side of the camera when drawn from right to left. I figured this out by setting isDoubleSided = false and applying an image containing text as the diffuse contents, and I could see that the image itself was flipped.
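A small sketch of that debugging trick, assuming node is the wall node from the snippet above and "textDebug" is a hypothetical image asset with readable text:

// a one-sided material makes the back face invisible, so a flipped wall
// shows up immediately as a mirrored (or missing) texture
node.geometry?.firstMaterial?.isDoubleSided = false
node.geometry?.firstMaterial?.diffuse.contents = UIImage(named: "textDebug") // hypothetical asset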
I have tried many things, but this is what helped me (code below):
Normalize both vectors.
Find the cross product of the two SCNVector3s.
If the cross product's y > 0, swap the from and to values.
Code
let normalizedTO = to.normalized()
let normalizedFrom = from.normalized()
// the cross product's y sign tells us the winding direction
let crossProduct = normalizedTO.cross(normalizedFrom)
var from = from
var to = to
if crossProduct.y > 0 {
    let temp = from
    from = to
    to = temp
}
// Inside an extension of SCNVector3
func normalized() -> SCNVector3 {
    if self.length() == 0 {
        return self
    }
    return self / self.length()
}

func cross(_ vec: SCNVector3) -> SCNVector3 {
    return SCNVector3(self.y * vec.z - self.z * vec.y,
                      self.z * vec.x - self.x * vec.z,
                      self.x * vec.y - self.y * vec.x)
}
Hope this is helpful. If anyone knows a better solution, please post an answer.
The following code shows my video file at the correct zPosition relative to the other elements I'm working with, creating a background video.
The problem I'm having is that the vertical video (1080x1920 pixels) gets rotated 90 degrees counterclockwise, and is stretched to fit as a landscape video. How can I ensure correct orientation without sacrificing my need to use the SKVideoNode with zPosition?
let videoNode: SKVideoNode? = {
    guard let urlString = Bundle.main.path(forResource: "merry", ofType: "mov") else {
        return nil
    }
    let url = URL(fileURLWithPath: urlString)
    let item = AVPlayerItem(url: url)
    player = AVPlayer(playerItem: item)
    return SKVideoNode(avPlayer: player)
}()
videoNode?.position = CGPoint(x: frame.midX, y: frame.midY)
videoNode?.size = self.frame.size
videoNode?.zPosition = 20
addChild(videoNode!)
player.play()
player.volume = 0
Many thanks!
Got there in the end with a workaround:
// fix to rotate the vertical video by 90 degrees and resize it to fit
videoNode?.zRotation = -CGFloat.pi / 2 // i.e. -90°
videoNode?.size.width = self.frame.size.height
videoNode?.size.height = self.frame.size.width
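A hedged alternative to hard-coding the rotation: the recorded orientation lives in the video track's preferredTransform, so you could rotate only when the asset is actually portrait. A sketch, assuming the same url, videoNode, and frame as above:

let asset = AVAsset(url: url)
if let track = asset.tracks(withMediaType: .video).first {
    let t = track.preferredTransform
    // a pure 90° rotation zeroes the diagonal of the affine transform
    if t.a == 0 && t.d == 0 {
        videoNode?.zRotation = -CGFloat.pi / 2
        videoNode?.size = CGSize(width: frame.size.height, height: frame.size.width)
    }
}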
I'm attempting to map a video as a texture onto a primitive cylinder for a VR project using SceneKit: an SKVideoNode embedded in an SKScene as a texture for a SceneKit SCNTube object, and I just can't get the video to display as a still image would. The Playground code below should generate moving video mapped onto the cylinder, but the mapping does not work:
EDIT: Added a single line at the end of the listing to fix this. The code below should work.
import UIKit
import SceneKit // for 3D mapping
import SpriteKit // for SKVideoNode
import QuartzCore // for basic animation
import XCPlayground // for live preview
import AVFoundation // for video playback engine
// create a scene view with an empty scene
var sceneView = SCNView(frame: CGRect(x: 0, y: 0, width: 300, height: 300))
var scene = SCNScene()
sceneView.scene = scene
// start a live preview of that view
XCPShowView("The Scene View", view: sceneView)
// default lighting
sceneView.autoenablesDefaultLighting = true
// a geometry object
var tube = SCNTube(innerRadius: 1.99, outerRadius: 2, height: 3)
var tubeNode = SCNNode(geometry: tube)
scene.rootNode.addChildNode(tubeNode)
// video scene
let urlStr = NSBundle.mainBundle().pathForResource("sample", ofType: "mp4")
let url = NSURL(fileURLWithPath: urlStr!)
let asset = AVURLAsset(URL: url, options: nil)
let playerItem = AVPlayerItem(asset: asset)
let player = AVPlayer(playerItem: playerItem)
let videoNode = SKVideoNode(AVPlayer: player)
let spritescene = SKScene(size: CGSize(width: 1211, height: 431))
videoNode.size.width = spritescene.size.width
videoNode.size.height = spritescene.size.height
spritescene.addChild(videoNode)
// configure the geometry object
var myImage = UIImage.init(named: "BandImage.jpeg") // unused in this listing; left over from testing with a still image
tube.firstMaterial?.diffuse.contents = spritescene
// set a rotation axis (no angle) to be able to
// use a nicer keypath below and avoid needing
// to wrap it in an NSValue
tubeNode.rotation = SCNVector4(x: 0.0, y: 1.0, z: 0.0, w: 0.0)
// animate the rotation of the torus
var spin = CABasicAnimation(keyPath: "rotation.w") // only animate the angle
spin.toValue = 2.0*M_PI
spin.duration = 3
spin.repeatCount = HUGE // for infinity
tubeNode.addAnimation(spin, forKey: "spin around")
// starts the video, solving the issue
sceneView.playing = true
I've posted my code (and some sample panoramic content) on GitHub for anyone who wants working sample code or is interested in collaborating on an open-source panoramic video player:
https://github.com/jglasse/OSVR
As it turns out, the simulator (and Playgrounds) doesn't support this feature. Moving the above code into a project and running it on a device, I finally have it working.
So the moral of the story is: if you're using SKVideoNodes as textures for SceneKit, use an actual device for testing.
Looking through different web sites and analyzing different resources, I found that to play 360° videos on the iPhone you are supposed to use a third-party library (such as Panorama). But I'm really interested in how it is possible to do it on your own, because the standard iOS elements do not support such functionality.
Please give some advice about the approaches I should use to create my own player for 360° videos.
You can do it using SceneKit, without any external libraries or dependencies.
Create a SCNScene and map your camera to the device movements. The two cameras are slightly offset, one per eye, to create the 3D stereoscopic effect.
override func viewDidLoad() {
    super.viewDidLoad()

    leftSceneView?.backgroundColor = UIColor.blackColor()
    rightSceneView?.backgroundColor = UIColor.whiteColor()

    // Create Scene
    scene = SCNScene()
    leftSceneView?.scene = scene
    rightSceneView?.scene = scene

    // Create cameras
    let camX = 0.0 as Float
    let camY = 0.0 as Float
    let camZ = 0.0 as Float
    let zFar = 50.0

    let leftCamera = SCNCamera()
    let rightCamera = SCNCamera()
    leftCamera.zFar = zFar
    rightCamera.zFar = zFar

    let leftCameraNode = SCNNode()
    leftCameraNode.camera = leftCamera
    leftCameraNode.position = SCNVector3(x: camX - 0.5, y: camY, z: camZ)

    let rightCameraNode = SCNNode()
    rightCameraNode.camera = rightCamera
    rightCameraNode.position = SCNVector3(x: camX + 0.5, y: camY, z: camZ)

    camerasNode = SCNNode()
    camerasNode!.position = SCNVector3(x: camX, y: camY, z: camZ)
    camerasNode!.addChildNode(leftCameraNode)
    camerasNode!.addChildNode(rightCameraNode)
    camerasNode!.eulerAngles = SCNVector3Make(degreesToRadians(-90.0), 0, 0)

    cameraRollNode = SCNNode()
    cameraRollNode!.addChildNode(camerasNode!)

    cameraPitchNode = SCNNode()
    cameraPitchNode!.addChildNode(cameraRollNode!)

    cameraYawNode = SCNNode()
    cameraYawNode!.addChildNode(cameraPitchNode!)

    scene!.rootNode.addChildNode(cameraYawNode!)
    leftSceneView?.pointOfView = leftCameraNode
    rightSceneView?.pointOfView = rightCameraNode

    // Respond to user head movement. Refreshes the position of the camera 60 times per second.
    motionManager = CMMotionManager()
    motionManager?.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager?.startDeviceMotionUpdatesUsingReferenceFrame(CMAttitudeReferenceFrame.XArbitraryZVertical)

    leftSceneView?.delegate = self
    leftSceneView?.playing = true
    rightSceneView?.playing = true
}
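The listing above assumes a degreesToRadians helper that isn't shown; a minimal version in the same Swift 2 style:

// convert degrees to radians for SceneKit's euler angles
func degreesToRadians(degrees: Float) -> Float {
    return degrees * Float(M_PI) / 180.0
}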
Update the camera position in the sceneRenderer:
func renderer(aRenderer: SCNSceneRenderer, updateAtTime time: NSTimeInterval) {
    // Render the scene
    dispatch_async(dispatch_get_main_queue()) { () -> Void in
        if let mm = self.motionManager, let motion = mm.deviceMotion {
            let currentAttitude = motion.attitude

            var orientationMultiplier = 1.0
            if UIApplication.sharedApplication().statusBarOrientation == UIInterfaceOrientation.LandscapeRight {
                orientationMultiplier = -1.0
            }

            self.cameraRollNode!.eulerAngles.x = Float(currentAttitude.roll * orientationMultiplier)
            self.cameraPitchNode!.eulerAngles.z = Float(currentAttitude.pitch)
            self.cameraYawNode!.eulerAngles.y = Float(currentAttitude.yaw)
        }
    }
}
Here is some code to add an SCNSphere displaying an AVPlayer.
func play() {
    //let fileURL: NSURL? = NSURL(string: "http://www.kolor.com/360-videos-files/noa-neal-graffiti-360-music-video-full-hd.mp4")
    let fileURL: NSURL? = NSURL.fileURLWithPath(NSBundle.mainBundle().pathForResource("vr", ofType: "mp4")!)

    if (fileURL != nil) {
        videoSpriteKitNode = SKVideoNode(AVPlayer: AVPlayer(URL: fileURL!))
        videoNode = SCNNode()
        videoNode!.geometry = SCNSphere(radius: 30)

        let spriteKitScene = SKScene(size: CGSize(width: 2500, height: 2500))
        spriteKitScene.scaleMode = .AspectFit

        videoSpriteKitNode!.position = CGPoint(x: spriteKitScene.size.width / 2.0, y: spriteKitScene.size.height / 2.0)
        videoSpriteKitNode!.size = spriteKitScene.size
        spriteKitScene.addChild(videoSpriteKitNode!)

        videoNode!.geometry?.firstMaterial?.diffuse.contents = spriteKitScene
        videoNode!.geometry?.firstMaterial?.doubleSided = true

        // Flip video upside down, so that it's shown in the right position
        var transform = SCNMatrix4MakeRotation(Float(M_PI), 0.0, 0.0, 1.0)
        transform = SCNMatrix4Translate(transform, 1.0, 1.0, 0.0)

        videoNode!.pivot = SCNMatrix4MakeRotation(Float(M_PI_2), 0.0, -1.0, 0.0)
        videoNode!.geometry?.firstMaterial?.diffuse.contentsTransform = transform
        videoNode!.position = SCNVector3(x: 0, y: 0, z: 0)

        scene!.rootNode.addChildNode(videoNode!)
        videoSpriteKitNode!.play()
        playingVideo = true
    }
}
I've put together a project on GitHub to show how, with instructions that should be clear!
It also works in VR with a Google Cardboard.
https://github.com/Aralekk/simple360player_iOS
Apart from Aralekk's repo, I found hanton's repo useful when creating my own 360° video player.
Here is the link to my repo and here is the related blog post.
You can use the framework pod 'NYT360Video' for a 360-degree player, in both Objective-C and Swift.
For Objective-C a default example is provided; for Swift, use the same framework as follows:
/// For the 360 degree view
// just import the framework
import NYT360Video

// declare the nyt360VC globally for the view controller
var nyt360VC: NYT360ViewController!

/// Player implementation
let videoURL = Bundle.main.url(forResource: "360Video", withExtension: "mp4")!
let player = AVPlayer(url: videoURL)

// plain (non-360) playback via AVPlayerLayer, for comparison;
// likely not needed when using the 360 view controller below
let playerLayer = AVPlayerLayer(player: player)
playerLayer.frame = self.view.bounds
self.view.layer.addSublayer(playerLayer)
player.play()

/// 360 degree view controller implementation, along with the motion manager
let manager: NYT360MotionManagement = NYT360MotionManager.shared()
self.nyt360VC = NYT360ViewController(avPlayer: player, motionManager: manager)

// Adding the 360 degree view controller's view
self.addChildViewController(nyt360VC)
self.view.addSubview(self.nyt360VC.view)
self.nyt360VC.didMove(toParentViewController: self)
I implemented a similar solution using SceneKit for iOS here: ThreeSixtyPlayer.
In my opinion this code is a bit simpler and more scalable than other solutions out there. It is however just the basics (no stereoscopic playback, only supports sphere geometry, doesn't yet support cardboard, etc.).