I'm playing around with ARKit a little and wanted to place a video inside an SCNPlane. My issue is that I don't see the player's content at all. The controls with the correct durations etc. are there, so the player seems to be set up correctly. Still, it is not showing up.
All the SO answers I found were solved by giving the playerLayer the correct size, but this is most certainly not my issue, because if I give the playerLayer a background color, it is shown (and in front of everything else, like it should be).
let videoPlayerController = AVVideoPlayerController()

private func handleFoundMagazine(_ imageAnchor: ARImageAnchor, _ node: SCNNode) {
    let size = imageAnchor.referenceImage.physicalSize
    if let videoNode = createVideoNode(size: size) {
        videoNode.name = "Magazine Video"
        node.addChildNode(videoNode)
        node.opacity = 1
    }
}

private func createVideoNode(size: CGSize) -> SCNNode? {
    guard let videoPath = Bundle.main.url(forResource: "Video_1_Alle", withExtension: "mp4") else {
        print("not found in resources")
        return nil
    }
    let videoItem = AVPlayerItem(url: videoPath)
    let videoPlayer = AVPlayer(playerItem: videoItem)
    videoPlayerController.player = videoPlayer

    let avMaterial = SCNMaterial()
    avMaterial.diffuse.contents = videoPlayerController.view

    let videoPlane = SCNPlane(width: size.width, height: size.height)
    videoPlane.materials = [avMaterial]

    let videoNode = SCNNode(geometry: videoPlane)
    videoNode.eulerAngles.x = -.pi / 2
    print("successfully created Video Node")
    return videoNode
}
import UIKit
import AVKit

class AVVideoPlayerController: AVPlayerViewController {

    var playerLayer = AVPlayerLayer()
    var isPlaying = false

    override func viewDidLoad() {
        super.viewDidLoad()
        self.view.backgroundColor = .red
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        setupPlayerLayer()
        player?.play()
    }

    fileprivate func setupPlayerLayer() {
        self.playerLayer = AVPlayerLayer(player: self.player)
        self.view.layer.addSublayer(playerLayer)
        playerLayer.frame = self.view.frame
        // playerLayer.backgroundColor = UIColor.blue.cgColor
    }
}
Doesn't look too complex, right?
BTW: once it lagged a little, and then I was able to see the video, BEHIND the red background of the view controller's view.
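For reference, the related threads below converge on the same workaround: skip the view controller and its view entirely, and hand the AVPlayer itself to the material's diffuse contents. A minimal sketch of createVideoNode rewritten that way (same asset name as above; this is an untested sketch, not a verified fix for this exact project):

private func createVideoNode(size: CGSize) -> SCNNode? {
    guard let videoURL = Bundle.main.url(forResource: "Video_1_Alle", withExtension: "mp4") else {
        print("video not found in bundle")
        return nil
    }
    let player = AVPlayer(url: videoURL)

    // SceneKit can render an AVPlayer directly; no intermediate view or layer is required.
    let videoPlane = SCNPlane(width: size.width, height: size.height)
    videoPlane.firstMaterial?.diffuse.contents = player

    let videoNode = SCNNode(geometry: videoPlane)
    videoNode.eulerAngles.x = -.pi / 2
    player.play()
    return videoNode
}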
Related
I'm playing around with ARKit and have created an app which overlays video content onto recognized images. Code is as follows:
import UIKit
import SceneKit
import ARKit
import SpriteKit

class ViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARImageTrackingConfiguration()
        if let imageToTrack = ARReferenceImage.referenceImages(inGroupNamed: "ALLimages", bundle: Bundle.main) {
            configuration.trackingImages = imageToTrack
            configuration.maximumNumberOfTrackedImages = 3
            print("Images successfully added")
        }
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }

    // MARK: - ARSCNViewDelegate

    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        let node = SCNNode()
        var videoNode = SKVideoNode()
        var videoScene = SKScene()
        if let imageAnchor = anchor as? ARImageAnchor {
            videoNode = SKVideoNode(fileNamed: "video1.mp4")
            videoNode.play()
            videoScene = SKScene(size: CGSize(width: 640, height: 360))
            videoNode.position = CGPoint(x: videoScene.size.width / 2, y: videoScene.size.height / 2)
            videoNode.yScale = -1.0
            videoScene.addChild(videoNode)
            let plane = SCNPlane(width: imageAnchor.referenceImage.physicalSize.width,
                                 height: imageAnchor.referenceImage.physicalSize.height)
            plane.firstMaterial?.diffuse.contents = videoScene
            let planeNode = SCNNode(geometry: plane)
            planeNode.eulerAngles.x = -.pi / 2
            node.addChildNode(planeNode)
        }
        return node
    }
}
This works perfectly... once! But the problem is that once the video has finished playing, because of the limited controls available via SKVideoNode, I cannot figure out how to restart it automatically. Ideally these videos should play on a loop.
I've done some research and it seems that the best way is to initialize my video node using an AVPlayer object.
So, I attempted to do this but cannot get it working.
I added var player = AVPlayer() in my class and then tried to initialize my videoNode as follows:
var videoNode: SKVideoNode? = {
    guard let urlString = Bundle.main.path(forResource: "video1", ofType: "mov") else {
        return nil
    }
    let url = URL(fileURLWithPath: urlString)
    let item = AVPlayerItem(url: url)
    let player = AVPlayer(playerItem: item)
    return SKVideoNode(avPlayer: player)
}()
I then attempted to use player.play(), but the video never plays. Instead my plane just appears as a blank rectangle over my images.
Once I successfully initialize this, I think I'll be able to add an observer to check when the video has finished playing and restart it, but I'm struggling to get to that point.
First, you don't need an SKVideoNode inside an SKScene inside an SCNNode. You can use an AVPlayer directly as the diffuse contents of your node's material:
plane.firstMaterial?.diffuse.contents = player
For looping, you have to subscribe to a notification on the player's current item and seek back to zero when it ends:
NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime,
                                       object: player.currentItem,
                                       queue: nil) { _ in
    player.seek(to: .zero)
    player.play()
}
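Putting the two pieces together, a minimal renderer(_:nodeFor:) built from the question's code might look like the following; video1.mov is the asset named in the question, and the rest is a sketch, not a drop-in implementation:

func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    guard let imageAnchor = anchor as? ARImageAnchor,
        let url = Bundle.main.url(forResource: "video1", withExtension: "mov") else { return nil }

    let player = AVPlayer(url: url)

    let plane = SCNPlane(width: imageAnchor.referenceImage.physicalSize.width,
                         height: imageAnchor.referenceImage.physicalSize.height)
    // The player itself is the texture; no SKScene wrapper and no y-flip needed.
    plane.firstMaterial?.diffuse.contents = player

    // Restart playback whenever the item reaches the end.
    NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime,
                                           object: player.currentItem,
                                           queue: nil) { _ in
        player.seek(to: .zero)
        player.play()
    }
    player.play()

    let planeNode = SCNNode(geometry: plane)
    planeNode.eulerAngles.x = -.pi / 2

    let node = SCNNode()
    node.addChildNode(planeNode)
    return node
}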
I have an ARKit app that recognises two different tracking images, Vuforia and Tracker.
It works when I have a different .scn for each tracker, and it hops between the two.
I am trying to get the Tracker image to overlay a video player. I have tried everything I can think of but am now stuck.
I have checked all the file names and links, but I think I must be missing something to get the video player to fire up.
Help!
import UIKit
import SceneKit
import ARKit
import AVFoundation

class ThirdViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet var sceneView: ARSCNView!

    let exampleVideoPlayer: AVPlayer = {
        // load example video from bundle
        guard let url = Bundle.main.url(forResource: "homer video", withExtension: "mov", subdirectory: "AR.scnassets") else {
            print("Could not find video file")
            return AVPlayer()
        }
        return AVPlayer(url: url)
    }()

    var FreemensNode: SCNNode?
    var VideoNode: SCNNode?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Set the view's delegate
        sceneView.delegate = self
        sceneView.autoenablesDefaultLighting = true
        let FreemensScene = SCNScene(named: "AR.scnassets/Freemens.scn")
        let VideoScene = SCNScene(named: "AR.scnassets/Video.scn")
        FreemensNode = FreemensScene?.rootNode
        VideoNode = VideoScene?.rootNode
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARImageTrackingConfiguration()
        if let trackingImages = ARReferenceImage.referenceImages(inGroupNamed: "Photos", bundle: Bundle.main) {
            configuration.trackingImages = trackingImages
            configuration.maximumNumberOfTrackedImages = 2
        }
        // Run the view's session
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }

    // MARK: - ARSCNViewDelegate

    public func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        let node = SCNNode()
        if let imageAnchor = anchor as? ARImageAnchor {
            // Create a plane
            let plane = SCNPlane(width: imageAnchor.referenceImage.physicalSize.width,
                                 height: imageAnchor.referenceImage.physicalSize.height)
            if imageAnchor.referenceImage.name == "Tracker" {
                // Set AVPlayer as the plane's texture and play
                plane.firstMaterial?.diffuse.contents = self.exampleVideoPlayer
                self.exampleVideoPlayer.play()
            } else if imageAnchor.referenceImage.name == "Vuforia" {
                plane.firstMaterial?.diffuse.contents = UIColor(white: 1, alpha: 0.0)
                let planeNode = SCNNode(geometry: plane)
                planeNode.eulerAngles.x = -.pi / 2
                node.addChildNode(planeNode)
            }
            var shapeNode: SCNNode?
            switch imageAnchor.referenceImage.name {
            case ArItem.Freemens.rawValue?:
                shapeNode = FreemensNode
            case ArItem.Video.rawValue?:
                shapeNode = VideoNode
            default:
                break
            }
            if imageAnchor.referenceImage.name == "Vuforia" {
                shapeNode = FreemensNode
            } else if imageAnchor.referenceImage.name == "Tracker" {
                shapeNode = VideoNode
            }
            guard let shape = shapeNode else { return nil }
            node.addChildNode(shape)
        }
        return node
    }

    enum ArItem: String {
        case Freemens = "Freemens"
        case Video = "Video"
    }
}
Full page code here:
import UIKit
import SceneKit
import ARKit
import SpriteKit
import AVFoundation

class ThirdViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet var sceneView: ARSCNView!

    var FreemensNode: SCNNode?
    var VideoNode: SCNNode?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Set the view's delegate
        sceneView.delegate = self
        sceneView.autoenablesDefaultLighting = true
        let FreemensScene = SCNScene(named: "ar.scnassets/Freemens.scn")
        let VideoScene = SCNScene(named: "ar.scnassets/Video.scn")
        FreemensNode = FreemensScene?.rootNode
        VideoNode = VideoScene?.rootNode
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARImageTrackingConfiguration()
        if let trackingImages = ARReferenceImage.referenceImages(inGroupNamed: "Photos", bundle: Bundle.main) {
            configuration.trackingImages = trackingImages
            configuration.maximumNumberOfTrackedImages = 2
        }
        // Run the view's session
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }

    // MARK: - ARSCNViewDelegate

    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        let node = SCNNode()
        if let imageAnchor = anchor as? ARImageAnchor {
            if let fileString = Bundle.main.path(forResource: "black", ofType: "mp4") {
                let videoItem = AVPlayerItem(url: URL(fileURLWithPath: fileString))
                let player = AVPlayer(playerItem: videoItem)
                // initialize video node with avplayer
                let videoNode = SKVideoNode(avPlayer: player)
                player.play()
                // add observer: when player.currentItem finishes playing, start again from the beginning
                NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime, object: player.currentItem, queue: nil) { (notification) in
                    player.seek(to: CMTime.zero)
                    player.play()
                    print("Looping Video")
                }
                // set the size (just a rough one will do)
                let videoScene = SKScene(size: CGSize(width: 480, height: 360))
                // center our video to the size of our video scene
                videoNode.position = CGPoint(x: videoScene.size.width / 2, y: videoScene.size.height / 2)
                // invert our video so it does not look upside down
                videoNode.yScale = -1.0
                // add the video to our scene
                videoScene.addChild(videoNode)
                // Create a plane
                let plane = SCNPlane(width: imageAnchor.referenceImage.physicalSize.width,
                                     height: imageAnchor.referenceImage.physicalSize.height)
                if imageAnchor.referenceImage.name == "Tracker" {
                    // Set the SKScene as the plane's texture
                    plane.firstMaterial?.diffuse.contents = videoScene
                } else if imageAnchor.referenceImage.name == "Vuforia" {
                    plane.firstMaterial?.diffuse.contents = UIColor(white: 1, alpha: 0.0)
                    let planeNode = SCNNode(geometry: plane)
                    planeNode.eulerAngles.x = -.pi / 2
                    node.addChildNode(planeNode)
                }
                var shapeNode: SCNNode?
                switch imageAnchor.referenceImage.name {
                case ArItem.Freemens.rawValue?:
                    shapeNode = FreemensNode
                case ArItem.Video.rawValue?:
                    shapeNode = VideoNode
                default:
                    break
                }
                if imageAnchor.referenceImage.name == "Vuforia" {
                    shapeNode = FreemensNode
                } else if imageAnchor.referenceImage.name == "Tracker" {
                    shapeNode = VideoNode
                }
                guard let shape = shapeNode else { return nil }
                node.addChildNode(shape)
            }
        }
        return node
    }

    enum ArItem: String {
        case Freemens = "Freemens"
        case Video = "Video"
    }
}
I've had success with this code:
// setup AV player & create SKVideoNode from avPlayer
let videoURL = URL(fileURLWithPath: Bundle.main.path(forResource: videoAssetName, ofType: videoAssetExtension)!)
let player = AVPlayer(url: videoURL)
player.actionAtItemEnd = .none
videoPlayerNode = SKVideoNode(avPlayer: player)

// setup player
let skSceneSize = orientation == .horizontal ? CGSize(width: 1280, height: 720) : CGSize(width: 406, height: 720)
let skScene = SKScene(size: skSceneSize)
skScene.addChild(videoPlayerNode)

videoPlayerNode.position = CGPoint(x: skScene.size.width / 2, y: skScene.size.height / 2)
videoPlayerNode.size = skScene.size

let scnPlaneSize: [String: CGFloat] = orientation == .horizontal ? ["width": 0.9, "height": 0.5063] : ["width": 0.5063, "height": 0.9]
let videoPlane = SCNPlane(width: scnPlaneSize["width"]!, height: scnPlaneSize["height"]!)
videoPlane.firstMaterial?.diffuse.contents = skScene
videoPlane.firstMaterial?.isDoubleSided = true
let videoPlaneNode = SCNNode(geometry: videoPlane)
node.addChildNode(videoPlaneNode)

// setup node to auto remove itself upon completion
NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime, object: player.currentItem, queue: nil, using: { (_) in
    DispatchQueue.main.async {
        if self.debug { NSLog("video completed") }
        // do something when the video ends
    }
})

// play the video node
videoPlayerNode.play()
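Note that with actionAtItemEnd = .none the player simply holds the last frame when the item finishes; if looping is wanted instead of removal, the same observer can rewind and resume, roughly:

NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime,
                                       object: player.currentItem,
                                       queue: nil) { _ in
    DispatchQueue.main.async {
        // rewind to the start and keep playing
        player.seek(to: .zero)
        player.play()
    }
}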
I use this simple code for playing a 360° video. I need to add a point to the video, and there is no problem with that. But how do I track taps on it? In this example, the point is added in the viewDidLoad method.
I tried touchesBegan, but this method is never called. I really hope for your help.
import UIKit
import SceneKit
import SpriteKit
import CoreMotion
import AVFoundation

class ViewControllerTwo: UIViewController {

    let motionManager = CMMotionManager()
    let cameraNode = SCNNode()
    var sphereNode: SCNNode!

    @IBOutlet weak var sceneView: SCNView!

    func createSphereNode(material: AnyObject?) -> SCNNode {
        let sphere = SCNSphere(radius: 100.0)
        sphere.segmentCount = 96
        sphere.firstMaterial!.isDoubleSided = true
        sphere.firstMaterial!.diffuse.contents = material
        let sphereNode = SCNNode(geometry: sphere)
        sphereNode.position = SCNVector3Make(0, 0, 0)
        return sphereNode
    }

    func configureScene(node sphereNode: SCNNode) {
        let scene = SCNScene()
        sceneView.scene = scene
        sceneView.showsStatistics = true
        sceneView.allowsCameraControl = true
        cameraNode.camera = SCNCamera()
        cameraNode.position = SCNVector3Make(0, 0, 0)
        scene.rootNode.addChildNode(sphereNode)
        scene.rootNode.addChildNode(cameraNode)
    }

    func startCameraTracking() {
        motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] (data, error) in
            guard let data = data else { return }
            let attitude: CMAttitude = data.attitude
            self?.cameraNode.eulerAngles = SCNVector3Make(Float(attitude.roll + Double.pi / 2.0),
                                                          -Float(attitude.yaw),
                                                          -Float(attitude.pitch))
        }
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        let url = URL(fileURLWithPath: Bundle.main.path(forResource: "google-help-vr", ofType: "mp4")!)
        let player = AVPlayer(url: url)
        let videoNode = SKVideoNode(avPlayer: player)
        let size = CGSize(width: 1025, height: 512)
        videoNode.size = size
        videoNode.position = CGPoint(x: size.width / 2, y: size.height / 2)
        let spriteScene = SKScene(size: size)
        spriteScene.addChild(videoNode)

        // How to detect when this is tapped?
        let circ = SKShapeNode(rectOf: CGSize(width: 50, height: 50), cornerRadius: 25)
        circ.fillColor = .red
        circ.isUserInteractionEnabled = true
        videoNode.addChild(circ)

        sphereNode = createSphereNode(material: spriteScene)
        configureScene(node: sphereNode)
        guard motionManager.isDeviceMotionAvailable else {
            return
        }
        startCameraTracking()
        player.play()
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        sceneView.play(self)
    }
}
I made a subclass for the SKShapeNode object in order to track taps through the touchesBegan method, but all without success.
You can use a UITapGestureRecognizer to get the 2D point, then use SCNSceneRenderer's .hitTest(_:options:) to get all of the possible intersections along that ray. Note that the method is on the SCNSceneRenderer protocol, which SCNView conforms to, so you may have missed it in the SCNView documentation.
@IBAction func tap(_ recognizer: UITapGestureRecognizer) {
    let location = recognizer.location(in: sceneView)
    if let firstResult = sceneView.hitTest(location, options: nil).first {
        // Do stuff with firstResult here
    }
}
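Since the SKShapeNode lives inside an SKScene that is only a texture on the sphere, the hit test by itself just reports that the sphere was hit. One way to reach the 2D node from firstResult is to convert the hit's texture coordinates into the SKScene's coordinate space. A sketch under that assumption (it assumes spriteScene is kept as a property rather than a local in viewDidLoad, and the y-flip may need adjusting to match the video orientation):

// Inside the tap handler above, in place of "Do stuff with firstResult here":
let texCoord = firstResult.textureCoordinates(withMappingChannel: 0)
let scenePoint = CGPoint(x: texCoord.x * spriteScene.size.width,
                         y: (1 - texCoord.y) * spriteScene.size.height)
if spriteScene.atPoint(scenePoint) is SKShapeNode {
    print("red dot tapped")
}

The recognizer itself still has to be attached once, e.g. sceneView.addGestureRecognizer(UITapGestureRecognizer(target: self, action: #selector(tap(_:)))) in viewDidLoad.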
I am playing around with a 360-degree video player using SpriteKit, SceneKit and CoreMotion. The player is working so far, but the video is always zoomed in a bit. It looks like the camera position is not at (0,0,0), but is somehow wrong on the Z axis. Unfortunately, I have not found a way to adjust that. To reproduce the behavior, just tap the screen while the video is playing and pinch to zoom out. This enables camera control with gestures; double tapping returns to camera control with the device.
import UIKit
import SceneKit
import CoreMotion
import SpriteKit
import AVFoundation

class Video360VC: UIViewController {

    let motionManager = CMMotionManager()
    let cameraNode = SCNNode()

    @IBOutlet weak var sceneView: SCNView!

    @IBAction func exitBtnClicked(_ sender: Any) {
        performSegueToReturnBack()
    }

    func createSphereNode(material: AnyObject?) -> SCNNode {
        let sphere = SCNSphere(radius: 20.0)
        sphere.firstMaterial!.isDoubleSided = true
        sphere.firstMaterial!.diffuse.contents = material
        let sphereNode = SCNNode(geometry: sphere)
        sphereNode.position = SCNVector3Make(0, 0, 0)
        sphereNode.rotation = SCNVector4Make(1, 0, 0, Float.pi)
        return sphereNode
    }

    func configureScene(node sphereNode: SCNNode) {
        // Set the scene
        let scene = SCNScene()
        sceneView.scene = scene
        sceneView.showsStatistics = true
        sceneView.allowsCameraControl = true
        // Camera, ...
        cameraNode.camera = SCNCamera()
        cameraNode.position = SCNVector3Make(0, 0, 0)
        scene.rootNode.addChildNode(sphereNode)
        scene.rootNode.addChildNode(cameraNode)
    }

    func startCameraTracking() {
        motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
        motionManager.startDeviceMotionUpdates(to: OperationQueue.main) {
            [weak self] (data: CMDeviceMotion?, error: Error?) in
            guard let data = data else { return }
            let attitude: CMAttitude = data.attitude
            self?.cameraNode.eulerAngles = SCNVector3Make(-Float(attitude.roll + Double.pi / 2),
                                                          Float(attitude.yaw),
                                                          -Float(attitude.pitch))
        }
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        guard let fileURL = Bundle.main.url(forResource: "360Test2", withExtension: "mp4") else {
            print("Video file not found")
            return
        }
        let player = AVPlayer(url: fileURL)
        let videoNode = SKVideoNode(avPlayer: player)
        let size = CGSize(width: 1280, height: 720)
        videoNode.size = size
        videoNode.position = CGPoint(x: size.width / 2, y: size.height / 2)
        let spriteScene = SKScene(size: size)
        spriteScene.scaleMode = .resizeFill
        spriteScene.addChild(videoNode)
        let sphereNode = createSphereNode(material: spriteScene)
        configureScene(node: sphereNode)
        guard motionManager.isDeviceMotionAvailable else {
            fatalError("Device motion is not available")
        }
        startCameraTracking()
        player.play()
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        self.sceneView.play(self)
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
    }
}
I used this video for testing, which can also be downloaded here.
Thanks a lot!
From what I can gather from your explanation, you have a problem with the so-called field of view (FOV). It may look like the camera is zoomed in, but actually it is not. Look for the FOV setting on your camera and try a value like 90 degrees.
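On iOS 11 and later the relevant property is SCNCamera.fieldOfView, measured in degrees with a default of 60. A one-line sketch of the suggestion, applied inside the question's configureScene:

cameraNode.camera = SCNCamera()
cameraNode.camera?.fieldOfView = 90 // wider than the 60-degree default, so the image appears less zoomed in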
I have a collection view cell that either plays a video or displays an image. The elements in the cell are generated programmatically. I added a tap gesture to toggle the sound when a video plays, but the gesture recognizer wasn't getting called. I tried placing a button in the storyboard and hooking up its action; that also didn't receive a call. Then I tried placing a view inside the cell; that also didn't display.
Here is my code with the tap gesture:
import UIKit
import AVKit
import AVFoundation

@IBDesignable class CHCollectionImageCell: UICollectionViewCell {

    // MARK: Properties

    var imgView: UIImageView! = UIImageView()
    var screenWidth: CGFloat = 0
    var screenHeight: CGFloat = 0
    var playerLayer: AVPlayerLayer!
    let tapOnCell = UITapGestureRecognizer(target: self, action: #selector(CHCollectionImageCell.changeMuteRegimeVideo))

    // MARK: Functions

    override func awakeFromNib() {
        super.awakeFromNib()
    }

    func configureCell(insight: InsightModel) {
        imgView.removeFromSuperview()
        if playerLayer != nil {
            playerLayer.removeFromSuperlayer()
            playerLayer = nil
        }
        self.removeGestureRecognizer(tapOnCell)
        if insight.isVideo {
            guard let unwrappedVideoURLString = insight.videoURL,
                let unwrappedVideoURL = URL(string: unwrappedVideoURLString) else {
                    return
            }
            let playerItem = AVPlayerItem(url: unwrappedVideoURL)
            let player = AVPlayer(playerItem: playerItem)
            playerLayer = AVPlayerLayer(player: player)
            playerLayer.frame = self.bounds
            player.isMuted = false
            layer.addSublayer(playerLayer)
            addGestureRecognizer(self.tapOnCell)
        } else {
            imgView.frame = CGRect(x: 0, y: 0, width: self.frame.size.width, height: self.frame.size.width)
            imgView.image = UIImage(named: "stone")
            imgView.contentMode = UIViewContentMode.scaleAspectFill
            imgView.clipsToBounds = true
            clipsToBounds = true
            addSubview(self.imgView)
        }
    }

    /*
    @IBAction func tapToTurnOfSound(_ sender: Any) {
        if isInsightViedo {
            if let unwrappedPlayer = playerLayer.player {
                unwrappedPlayer.isMuted = !unwrappedPlayer.isMuted
            }
        }
        // Even tried adding a view in the cell as below
        // let tapView = UIView()
        // tapView.backgroundColor = ColorCodes.appThemeColor
        // self.addSubview(tapView)
        // self.bringSubview(toFront: tapView)
        // tapView.addGestureRecognizer(tapOnCell)
    }
    */

    func configureCell() {
        imgView.removeFromSuperview()
        if playerLayer != nil {
            playerLayer.removeFromSuperlayer()
            playerLayer = nil
        }
        self.removeGestureRecognizer(tapOnCell)
        imgView.frame = CGRect(x: 0, y: 0, width: self.frame.size.width, height: self.frame.size.width)
        imgView.image = UIImage(named: "stone")
        imgView.contentMode = UIViewContentMode.scaleAspectFill
        imgView.clipsToBounds = true
        clipsToBounds = true
        addSubview(self.imgView)
    }

    func changeMuteRegimeVideo() {
        if let unwrappedPlayer = playerLayer.player {
            unwrappedPlayer.isMuted = !unwrappedPlayer.isMuted
        }
    }
}
I am doing the same thing in my application by using the following code:
let longPressGesture: UILongPressGestureRecognizer = UILongPressGestureRecognizer(target: self, action: #selector(viewController.longPress(_:)))
longPressGesture.minimumPressDuration = 0.8
longPressGesture.delegate = self
collectionView.addGestureRecognizer(longPressGesture)
and then implement the function:
func longPress(_ longPressGestureRecognizer: UILongPressGestureRecognizer) {
    if longPressGestureRecognizer.state == UIGestureRecognizerState.began {
        let touchPoint = longPressGestureRecognizer.location(in: collectionView)
        if let index = collectionView.indexPathForItem(at: touchPoint) {
            // do whatever you want to do with this index
        }
    }
}
You can do whatever you want in this function. In my case, I used it to enlarge the image in the collection view.
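As for the original per-cell recognizer, one likely culprit is that it is created in a stored-property initializer, where self is not yet available as a valid target. A hedged sketch of the usual workaround is a lazy property, so the recognizer is only built once self exists:

// Hypothetical fix for the cell above: defer creation until self is valid.
lazy var tapOnCell: UITapGestureRecognizer = UITapGestureRecognizer(
    target: self,
    action: #selector(CHCollectionImageCell.changeMuteRegimeVideo))

The rest of configureCell(insight:) can stay as it is, adding and removing the recognizer on cell reuse.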