How to play a hosted video on an image with ARKit in Swift? - ios

I can play the video with no issue when it is stored locally, but now I want to play a video that is hosted on my Google Drive.
override func viewWillAppear(_ animated: Bool) {
super.viewWillAppear(animated)
let configuration = ARImageTrackingConfiguration()
if let trackedImages = ARReferenceImage.referenceImages(inGroupNamed: "TestImages", bundle: Bundle.main){
configuration.trackingImages = trackedImages
configuration.maximumNumberOfTrackedImages = 1
}
// Run the view's session
ARView.session.run(configuration)
}
Here is my function where the video is rendered onto the image:
func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
let node = SCNNode()
if let imageAnchor = anchor as? ARImageAnchor{
let videoNode = SKVideoNode(fileNamed: "Test.mp4")
videoNode.play()
let videoScene = SKScene(size: CGSize(width: 1080, height:720 ))
videoScene.addChild(videoNode)
let plane = SCNPlane(width: imageAnchor.referenceImage.physicalSize.width, height: imageAnchor.referenceImage.physicalSize.height)
videoNode.position = CGPoint(x: videoScene.size.width/2, y: videoScene.size.height/2)
videoNode.yScale = -1.0
plane.firstMaterial?.diffuse.contents = videoScene
let planeNode = SCNNode(geometry: plane)
planeNode.eulerAngles.x = -.pi/2
node.addChildNode(planeNode)
}
return node
}
I am new to ARKit, so I am only learning the basic functions and enjoying working with them.

Your code looks just fine. The only thing you need to change is the line where you define the videoNode.
Example:
let videoNode = SKVideoNode(url: URL(string: "https://clips.vorwaerts-gmbh.de/big_buck_bunny.mp4")!)
Hope this helps!
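If you want a little more control over a remote clip (buffering, looping, pausing), a minimal sketch of the same renderer backed by an AVPlayer might look like the following. The URL is the sample clip from the answer above; note that a Google Drive share link is a web page rather than a direct file URL, so you would need a direct-download link for streaming to work.
import ARKit
import SceneKit
import SpriteKit
import AVFoundation

func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    let node = SCNNode()
    guard let imageAnchor = anchor as? ARImageAnchor,
          let url = URL(string: "https://clips.vorwaerts-gmbh.de/big_buck_bunny.mp4") else {
        return node
    }
    // Stream the remote file through AVPlayer and wrap it in an SKVideoNode.
    let player = AVPlayer(url: url)
    let videoNode = SKVideoNode(avPlayer: player)
    videoNode.play()
    // Host the video in an SKScene and map that scene onto the plane as a texture.
    let videoScene = SKScene(size: CGSize(width: 1080, height: 720))
    videoNode.position = CGPoint(x: videoScene.size.width / 2, y: videoScene.size.height / 2)
    videoNode.yScale = -1.0
    videoScene.addChild(videoNode)
    let plane = SCNPlane(width: imageAnchor.referenceImage.physicalSize.width,
                         height: imageAnchor.referenceImage.physicalSize.height)
    plane.firstMaterial?.diffuse.contents = videoScene
    let planeNode = SCNNode(geometry: plane)
    planeNode.eulerAngles.x = -.pi / 2
    node.addChildNode(planeNode)
    return node
}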

Related

Initializing SKVideoNode using AVPlayer object - SWIFT

I'm playing around with ARKit and have created an app which overlays video content onto recognized images. Code is as follows:
import UIKit
import SceneKit
import ARKit
import SpriteKit
class ViewController: UIViewController, ARSCNViewDelegate {
@IBOutlet var sceneView: ARSCNView!
override func viewDidLoad() {
super.viewDidLoad()
sceneView.delegate = self
}
override func viewWillAppear(_ animated: Bool) {
super.viewWillAppear(animated)
let configuration = ARImageTrackingConfiguration()
if let imageToTrack = ARReferenceImage.referenceImages(inGroupNamed: "ALLimages", bundle: Bundle.main) {
configuration.trackingImages = imageToTrack
configuration.maximumNumberOfTrackedImages = 3
print("Images successfully added")
}
sceneView.session.run(configuration)
}
override func viewWillDisappear(_ animated: Bool) {
super.viewWillDisappear(animated)
sceneView.session.pause()
}
// MARK: - ARSCNViewDelegate
func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
let node = SCNNode()
var videoNode = SKVideoNode()
var videoScene = SKScene()
if let imageAnchor = anchor as? ARImageAnchor {
videoNode = SKVideoNode(fileNamed: "video1.mp4")
videoNode.play()
videoScene = SKScene(size: CGSize(width: 640, height: 360))
videoNode.position = CGPoint(x: videoScene.size.width / 2, y: videoScene.size.height / 2)
videoNode.yScale = -1.0
videoScene.addChild(videoNode)
let plane = SCNPlane(width: imageAnchor.referenceImage.physicalSize.width, height: imageAnchor.referenceImage.physicalSize.height)
plane.firstMaterial?.diffuse.contents = videoScene
let planeNode = SCNNode(geometry: plane)
planeNode.eulerAngles.x = -.pi/2
node.addChildNode(planeNode)
}
return node
}
}
This works perfectly... once! The problem is that once the video has finished playing, the limited controls available on SKVideoNode mean I cannot figure out how to restart it automatically. Ideally these videos should play on a loop.
I've done some research and it seems that the best way is to initialize my video node using an AVPlayer object.
So, I attempted to do this but cannot get it working.
I added var player = AVPlayer() in my class and then tried to initialize my videoNode as follows:
var videoNode: SKVideoNode? = {
guard let urlString = Bundle.main.path(forResource: "video1", ofType: "mov") else {
return nil
}
let url = URL(fileURLWithPath: urlString)
let item = AVPlayerItem(url: url)
let player = AVPlayer(playerItem: item)
return SKVideoNode(avPlayer: player)
}()
I then attempted to use player.play(), but the video never plays. Instead my plane just appears as a blank rectangle over my images.
Once I successfully initialize this, I think I can add an observer to check when the video has finished playing and restart it, but I'm struggling to get to that point.
First, you don't need an SKVideoNode inside an SKScene inside an SCNNode. You can use the AVPlayer directly as the diffuse contents of your plane's material:
plane.firstMaterial?.diffuse.contents = player
For looping, subscribe to a notification on the player's current item and reset the time to zero when it ends:
NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime, object: player.currentItem, queue: nil) { _ in
player.seek(to: .zero)
player.play()
}
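Put together, a minimal sketch of the delegate method using this approach might look like the one below (assuming the same bundled video1.mov and that these members live in your ARSCNViewDelegate view controller). One likely reason the original lazy initializer never played is that the closure creates its own local player, so calling play() on the separate class-level var player = AVPlayer() has no effect on the player the node actually uses; keeping a single reference avoids that.
import ARKit
import SceneKit
import AVFoundation

// Keep a strong reference so the player isn't deallocated while the video plays.
var player: AVPlayer?

func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    let node = SCNNode()
    guard let imageAnchor = anchor as? ARImageAnchor,
          let path = Bundle.main.path(forResource: "video1", ofType: "mov") else {
        return node
    }
    let avPlayer = AVPlayer(url: URL(fileURLWithPath: path))
    player = avPlayer
    // Use the AVPlayer directly as the plane's texture; no SKScene required.
    let plane = SCNPlane(width: imageAnchor.referenceImage.physicalSize.width,
                         height: imageAnchor.referenceImage.physicalSize.height)
    plane.firstMaterial?.diffuse.contents = avPlayer
    let planeNode = SCNNode(geometry: plane)
    planeNode.eulerAngles.x = -.pi / 2
    node.addChildNode(planeNode)
    // Loop: when the item finishes, rewind to the start and keep playing.
    NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime,
                                           object: avPlayer.currentItem,
                                           queue: nil) { _ in
        avPlayer.seek(to: .zero)
        avPlayer.play()
    }
    avPlayer.play()
    return node
}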

Why isn't my ARKit Video showing in app view

I have an ARKit app that recognises two different tracking images, Vuforia and Tracker.
It works when I have a different .scn for each tracker, and it hops between the two.
I am trying to get the Tracker image to overlay a video player. I have tried everything I can think of but am now stuck.
I have checked all the file names and links, but I think I must be missing something to get the video player to fire up.
Help!
import UIKit
import SceneKit
import ARKit
class ThirdViewController: UIViewController, ARSCNViewDelegate {
@IBOutlet var sceneView: ARSCNView!
let exampleVideoPlayer: AVPlayer = {
//load example video from bundle
guard let url = Bundle.main.url(forResource: "homer video", withExtension: "mov", subdirectory: "AR.scnassets") else {
print("Could not find video file")
return AVPlayer()
}
return AVPlayer(url: url)
}()
var FreemensNode: SCNNode?
var VideoNode: SCNNode?
override func viewDidLoad() {
super.viewDidLoad()
// Set the view's delegate
sceneView.delegate = self
sceneView.autoenablesDefaultLighting = true
let FreemensScene = SCNScene(named: "AR.scnassets/Freemens.scn")
let VideoScene = SCNScene(named: "AR.scnassets/Video.scn")
FreemensNode = FreemensScene?.rootNode
VideoNode = VideoScene?.rootNode
}
override func viewWillAppear(_ animated: Bool) {
super.viewWillAppear(animated)
let configuration = ARImageTrackingConfiguration()
if let trackingImages = ARReferenceImage.referenceImages(inGroupNamed: "Photos", bundle: Bundle.main){
configuration.trackingImages = trackingImages
configuration.maximumNumberOfTrackedImages = 2
}
// Run the view's session
sceneView.session.run(configuration)
}
override func viewWillDisappear(_ animated: Bool) {
super.viewWillDisappear(animated)
sceneView.session.pause()
}
// MARK: - ARSCNViewDelegate
public func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
let node = SCNNode()
if let imageAnchor = anchor as? ARImageAnchor {
// Create a plane
let plane = SCNPlane(width: imageAnchor.referenceImage.physicalSize.width, height: imageAnchor.referenceImage.physicalSize.height)
if imageAnchor.referenceImage.name == "Tracker" {
// Set AVPlayer as the plane's texture and play
plane.firstMaterial?.diffuse.contents = self.exampleVideoPlayer
self.exampleVideoPlayer.play()
} else if imageAnchor.referenceImage.name == "Vuforia" {
plane.firstMaterial?.diffuse.contents = UIColor(white: 1, alpha: 0.0)
let planeNode = SCNNode(geometry: plane)
planeNode.eulerAngles.x = -.pi / 2
node.addChildNode(planeNode)
}
var shapeNode: SCNNode?
switch imageAnchor.referenceImage.name {
case ArItem.Freemens.rawValue :
shapeNode = FreemensNode
case ArItem.Video.rawValue :
shapeNode = VideoNode
default:
break
}
if imageAnchor.referenceImage.name == "Vuforia" {
shapeNode = FreemensNode
} else if imageAnchor.referenceImage.name == "Tracker"{
shapeNode = VideoNode
}
guard let shape = shapeNode else { return nil}
node.addChildNode(shape)
}
return node
}
enum ArItem : String {
case Freemens = "Freemens"
case Video = "Video"
}
}
Full page code here:
import UIKit
import SceneKit
import ARKit
class ThirdViewController: UIViewController, ARSCNViewDelegate {
@IBOutlet var sceneView: ARSCNView!
var FreemensNode: SCNNode?
var VideoNode: SCNNode?
override func viewDidLoad() {
super.viewDidLoad()
// Set the view's delegate
sceneView.delegate = self
sceneView.autoenablesDefaultLighting = true
let FreemensScene = SCNScene(named: "ar.scnassets/Freemens.scn")
let VideoScene = SCNScene(named: "ar.scnassets/Video.scn")
FreemensNode = FreemensScene?.rootNode
VideoNode = VideoScene?.rootNode
}
override func viewWillAppear(_ animated: Bool) {
super.viewWillAppear(animated)
let configuration = ARImageTrackingConfiguration()
if let trackingImages = ARReferenceImage.referenceImages(inGroupNamed: "Photos", bundle: Bundle.main){
configuration.trackingImages = trackingImages
configuration.maximumNumberOfTrackedImages = 2
}
// Run the view's session
sceneView.session.run(configuration)
}
override func viewWillDisappear(_ animated: Bool) {
super.viewWillDisappear(animated)
sceneView.session.pause()
}
// MARK: - ARSCNViewDelegate
func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
let node = SCNNode()
if let imageAnchor = anchor as? ARImageAnchor {
if let fileString = Bundle.main.path(forResource: "black", ofType: "mp4") {
let videoItem = AVPlayerItem(url: URL(fileURLWithPath: fileString))
let player = AVPlayer(playerItem: videoItem)
//initialize video node with avplayer
let videoNode = SKVideoNode(avPlayer: player)
player.play()
// add observer when our player.currentItem finishes player, then start playing from the beginning
NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime, object: player.currentItem, queue: nil) { (notification) in
player.seek(to: CMTime.zero)
player.play()
print("Looping Video")
}
// set the size (just a rough one will do)
let videoScene = SKScene(size: CGSize(width: 480, height: 360))
// center our video to the size of our video scene
videoNode.position = CGPoint(x: videoScene.size.width / 2, y: videoScene.size.height / 2)
// invert our video so it does not look upside down
videoNode.yScale = -1.0
// add the video to our scene
videoScene.addChild(videoNode)
// Create a plane
let plane = SCNPlane(width: imageAnchor.referenceImage.physicalSize.width, height: imageAnchor.referenceImage.physicalSize.height)
if imageAnchor.referenceImage.name == "Tracker" {
// Set AVPlayer as the plane's texture and play
plane.firstMaterial?.diffuse.contents = videoScene
} else if imageAnchor.referenceImage.name == "Vuforia" {
plane.firstMaterial?.diffuse.contents = UIColor(white: 1, alpha: 0.0)
let planeNode = SCNNode(geometry: plane)
planeNode.eulerAngles.x = -.pi / 2
node.addChildNode(planeNode)
}
var shapeNode: SCNNode?
switch imageAnchor.referenceImage.name {
case ArItem.Freemens.rawValue :
shapeNode = FreemensNode
case ArItem.Video.rawValue :
shapeNode = VideoNode
default:
break
}
if imageAnchor.referenceImage.name == "Vuforia" {
shapeNode = FreemensNode
} else if imageAnchor.referenceImage.name == "Tracker"{
shapeNode = VideoNode
}
guard let shape = shapeNode else { return nil}
node.addChildNode(shape)
}
return node
}
enum ArItem : String {
case Freemens = "Freemens"
case Video = "Video"
}
}
}
I've had success with this code
// setup AV player & create SKVideoNode from avPlayer
let videoURL = URL(fileURLWithPath: Bundle.main.path(forResource: videoAssetName, ofType: videoAssetExtension)!)
let player = AVPlayer(url: videoURL)
player.actionAtItemEnd = .none
videoPlayerNode = SKVideoNode(avPlayer: player)
// setup player
let skSceneSize = orientation == .horizontal ? CGSize(width: 1280, height: 720) : CGSize(width: 406, height: 720)
let skScene = SKScene(size: skSceneSize)
skScene.addChild(videoPlayerNode)
videoPlayerNode.position = CGPoint(x: skScene.size.width/2, y: skScene.size.height/2)
videoPlayerNode.size = skScene.size
let scnPlaneSize : [String : CGFloat] = orientation == .horizontal ? ["width": 0.9, "height": 0.5063] : ["width": 0.5063, "height": 0.9]
let videoPlane = SCNPlane(width: scnPlaneSize["width"]!, height: scnPlaneSize["height"]!)
videoPlane.firstMaterial?.diffuse.contents = skScene
videoPlane.firstMaterial?.isDoubleSided = true
let videoPlaneNode = SCNNode(geometry: videoPlane)
node.addChildNode(videoPlaneNode)
// setup node to auto remove itself upon completion
NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime, object: player.currentItem, queue: nil, using: { (_) in
DispatchQueue.main.async {
if self.debug { NSLog("video completed") }
// do something when the video ends
}
})
// play the video node
videoPlayerNode.play()
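If, instead of removing the node, you want the clip to loop (which player.actionAtItemEnd = .none already allows, since the player neither pauses nor advances when the item ends), a minimal variant of that observer might be:
NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime,
                                       object: player.currentItem,
                                       queue: nil) { _ in
    // Rewind and keep playing instead of tearing the node down.
    player.seek(to: .zero)
    player.play()
}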

No preview with the downloaded image in ARKit

I have been working on an ARKit app and achieved my goal of detecting pictures in the scene and playing back video on them.
The problem occurred when I tried to fetch the image from the internet.
* The image got detected and playback started (I could hear the audio), but no video ever showed in the scene. (I have reverted the code below to where I started.)
* What I actually want is to update the reference images and the playback videos on the go once my app is in the App Store.
Kindly suggest the best solution. Thanks.
Below is my complete code:
import UIKit
import SceneKit
import ARKit
import Alamofire
import AlamofireImage
class ViewController: UIViewController, ARSCNViewDelegate {
@IBOutlet var sceneView: ARSCNView!
var imageServer = [UIImage]()
var trackedImages = Set<ARReferenceImage>()
let configuration = ARImageTrackingConfiguration()
let videoNode = SKVideoNode(url: URL(fileURLWithPath: "https://example.com/video1.mp4"))
override func viewDidLoad() {
super.viewDidLoad()
sceneView.delegate = self
sceneView.showsStatistics = true
}
override func viewWillAppear(_ animated: Bool) {
super.viewWillAppear(animated)
fetchImage {
self.configuration.trackingImages = self.trackedImages
self.configuration.maximumNumberOfTrackedImages = 1
self.sceneView.session.run(self.configuration)
}
}
override func viewWillDisappear(_ animated: Bool) {
super.viewWillDisappear(animated)
sceneView.session.pause()
}
func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
let node = SCNNode()
if let imageAnchor = anchor as? ARImageAnchor {
videoNode.play()
let videoScene = SKScene(size: CGSize(width: 480, height: 360))
videoNode.position = CGPoint(x: videoScene.size.width / 2, y: videoScene.size.height / 2)
videoNode.yScale = -1.0
videoScene.addChild(videoNode)
let plane = SCNPlane(width: imageAnchor.referenceImage.physicalSize.width, height: imageAnchor.referenceImage.physicalSize.height)
plane.firstMaterial?.diffuse.contents = videoScene
let planeNode = SCNNode(geometry: plane)
planeNode.eulerAngles.x = -.pi / 2
node.addChildNode(planeNode)
}
return node
}
func fetchImage(completion: @escaping () -> ()) {
Alamofire.request("https://example.com/four.png").responseImage { response in
debugPrint(response)
print(response.request as Any)
print(response.response as Any)
debugPrint(response.result)
if let image = response.result.value {
print("image downloaded: \(image)")
self.imageServer.append(image)
print("ImageServer append Successful")
print("The new number of images = \(self.imageServer.count)")
}
completion()
}
}
}
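For the "update the reference images on the go" part, ARKit can build reference images at runtime from a CGImage. Below is a minimal sketch of a hypothetical helper (referenceImages(from:) is a made-up name, and the 0.1 m physicalWidth is a placeholder for the real printed size of your image) that could be called from the fetchImage completion before running the session.
import ARKit
import UIKit

// Convert downloaded UIImages into runtime ARReferenceImages.
// physicalWidth is in metres and must match the printed size of the tracked image.
func referenceImages(from images: [UIImage], physicalWidth: CGFloat = 0.1) -> Set<ARReferenceImage> {
    var references = Set<ARReferenceImage>()
    for (index, image) in images.enumerated() {
        guard let cgImage = image.cgImage else { continue }
        let reference = ARReferenceImage(cgImage, orientation: .up, physicalWidth: physicalWidth)
        reference.name = "downloaded-\(index)"
        references.insert(reference)
    }
    return references
}
Inside the fetchImage completion you could then set self.configuration.trackingImages = referenceImages(from: self.imageServer) instead of the asset-catalog set. As for the missing video itself, note that URL(fileURLWithPath:) only resolves local paths; a hosted file needs URL(string:), as in the first answer above.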

Stop SKVideoNode Playing When Not In View Of Camera

I am using ARKit with SpriteKit for image recognition. Recognition works normally, and after an image is recognised I add a video on top of that picture. When the camera changes direction and the videoNode is no longer in view, how can I remove (or stop) the video?
simple code:
override func viewDidLoad() {
super.viewDidLoad()
guard let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "Room", bundle: nil) else {
fatalError("Missing expected asset catalog resources.")
}
let configuration = ARWorldTrackingConfiguration()
configuration.detectionImages = referenceImages
sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
func view(_ view: ARSKView, didAdd node: SKNode, for anchor: ARAnchor) {
if let imageAnchor = anchor as? ARImageAnchor,
self.myImage == imageAnchor.referenceImage {
node.addChild(self.addNode())
}
}
private func addNode () -> SKVideoNode {
let video = SKVideoNode(url: URL(string: "http://techslides.com/demos/sample-videos/small.mp4")!)
video.size = CGSize(width: 40, height: 40)
video.position = CGPoint(x: 0, y: 0)
video.play()
return video
}
Just do camera.contains(videoNode) if you want to check whether the videoNode is in the camera's view or not.
Basically, wherever you handle the camera moving, do:
if !scene.camera.contains(videoNode){
videoNode.pause()
}
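A minimal sketch of where that check could live, assuming your SKScene subclass has an SKCameraNode assigned to its camera property (ARSKView does not add one for you) and a reference to the video node; VideoScene and the videoNode property here are hypothetical names:
import SpriteKit

class VideoScene: SKScene {
    // Set this from the view controller after the video node is added.
    weak var videoNode: SKVideoNode?

    override func update(_ currentTime: TimeInterval) {
        super.update(currentTime)
        guard let videoNode = videoNode, let camera = camera else { return }
        // Pause when the node leaves the camera's viewport, resume when it returns.
        if camera.contains(videoNode) {
            videoNode.play()
        } else {
            videoNode.pause()
        }
    }
}
In a real app you might also track the playing state so you don't call play() on every frame while the node is visible.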

ARKit image recognition and AR visualization of the image

At the company where I work I need to build this application: I have to recognise an image of a painting and, once it is recognised, visualise it in AR. In practice, I find the real picture and, above the painting in AR, display text or selectable points with various characteristics of the picture in question. At the moment I have this code for the AR, which recognises the image and displays a plane above it. Can you help me create a view above the picture with the features listed above?
import ARKit
import SceneKit
import UIKit
class ViewController: UIViewController, ARSCNViewDelegate {
@IBOutlet var sceneView: ARSCNView!
@IBOutlet weak var blurView: UIVisualEffectView!
/// The view controller that displays the status and "restart experience" UI.
lazy var statusViewController: StatusViewController = {
return childViewControllers.lazy.compactMap({ $0 as? StatusViewController }).first!
}()
/// A serial queue for thread safety when modifying the SceneKit node graph.
let updateQueue = DispatchQueue(label: Bundle.main.bundleIdentifier! +
".serialSceneKitQueue")
/// Convenience accessor for the session owned by ARSCNView.
var session: ARSession {
return sceneView.session
}
// MARK: - View Controller Life Cycle
override func viewDidLoad() {
super.viewDidLoad()
sceneView.delegate = self
sceneView.session.delegate = self
// Hook up status view controller callback(s).
statusViewController.restartExperienceHandler = { [unowned self] in
self.restartExperience()
}
}
override func viewDidAppear(_ animated: Bool) {
super.viewDidAppear(animated)
// Prevent the screen from being dimmed to avoid interrupting the AR experience.
UIApplication.shared.isIdleTimerDisabled = true
// Start the AR experience
resetTracking()
}
override func viewWillDisappear(_ animated: Bool) {
super.viewWillDisappear(animated)
session.pause()
}
// MARK: - Session management (Image detection setup)
/// Prevents restarting the session while a restart is in progress.
var isRestartAvailable = true
/// Creates a new AR configuration to run on the `session`.
/// - Tag: ARReferenceImage-Loading
func resetTracking() {
guard let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources", bundle: nil) else {
fatalError("Missing expected asset catalog resources.")
}
let configuration = ARWorldTrackingConfiguration()
configuration.detectionImages = referenceImages
session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
statusViewController.scheduleMessage("Look around to detect images", inSeconds: 7.5, messageType: .contentPlacement)
}
// MARK: - ARSCNViewDelegate (Image detection results)
/// - Tag: ARImageAnchor-Visualizing
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
guard let imageAnchor = anchor as? ARImageAnchor else { return }
let referenceImage = imageAnchor.referenceImage
updateQueue.async {
// Create a plane to visualize the initial position of the detected image.
let plane = SCNPlane(width: referenceImage.physicalSize.width,
height: referenceImage.physicalSize.height)
let planeNode = SCNNode(geometry: plane)
planeNode.opacity = 0.25
/*
`SCNPlane` is vertically oriented in its local coordinate space, but
`ARImageAnchor` assumes the image is horizontal in its local space, so
rotate the plane to match.
*/
planeNode.eulerAngles.x = -.pi / 2
/*
Image anchors are not tracked after initial detection, so create an
animation that limits the duration for which the plane visualization appears.
*/
planeNode.runAction(self.imageHighlightAction)
// Add the plane visualization to the scene.
node.addChildNode(planeNode)
}
DispatchQueue.main.async {
let imageName = referenceImage.name ?? ""
self.statusViewController.cancelAllScheduledMessages()
self.statusViewController.showMessage("Detected image “\(imageName)”")
}
}
var imageHighlightAction: SCNAction {
return .sequence([
.wait(duration: 0.25),
.fadeOpacity(to: 0.85, duration: 0.25),
.fadeOpacity(to: 0.15, duration: 0.25),
.fadeOpacity(to: 0.85, duration: 0.25),
.fadeOut(duration: 0.5),
.removeFromParentNode()
])
}
}
One way to tackle your problem is to create an SKScene and render it as the contents of an SCNMaterial.
Here is a fully commented example for you which makes use of the following ARSCNViewDelegate method:
//--------------------------
// MARK: - ARSCNViewDelegate
//--------------------------
extension ViewController: ARSCNViewDelegate {
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
//1. Check We Have An ARImageAnchor And Have Detected Our Reference Image
guard let imageAnchor = anchor as? ARImageAnchor else { return }
let referenceImage = imageAnchor.referenceImage
//2. Get The Physical Width & Height Of Our Reference Image
let width = CGFloat(referenceImage.physicalSize.width)
let height = CGFloat(referenceImage.physicalSize.height)
//3. Create An SKScene Which We Will Display Above Our Target
let overlayNode = SCNNode()
let spriteKitScene = SKScene(size: CGSize(width: 600, height: 300))
spriteKitScene.backgroundColor = .clear
let imageName = SKLabelNode(text: "Line Friends")
imageName.position = CGPoint(x: 600/2, y: 240)
imageName.horizontalAlignmentMode = .center
imageName.fontSize = 60
imageName.fontName = "San Fransisco"
spriteKitScene.addChild(imageName)
let artistName = SKLabelNode(text: "By Line Coorporation")
artistName.position = CGPoint(x: 600/2, y: 180)
artistName.horizontalAlignmentMode = .center
artistName.fontSize = 60
artistName.fontName = "San Fransisco"
spriteKitScene.addChild(artistName)
let creationDate = SKLabelNode(text: "Created 2011")
creationDate.position = CGPoint(x: 600/2, y: 120)
creationDate.horizontalAlignmentMode = .center
creationDate.fontSize = 60
creationDate.fontName = "San Fransisco"
spriteKitScene.addChild(creationDate)
let planeHeight = height/2
let overlayGeometry = SCNPlane(width: width, height: planeHeight)
overlayNode.geometry = overlayGeometry
//4. Add The SpriteKit Scene As The SCNPlane's Geometry
overlayGeometry.firstMaterial?.diffuse.contents = spriteKitScene
//5. Rotate The Material Contents So It Isn't Backwards
overlayGeometry.firstMaterial?.diffuse.contentsTransform = SCNMatrix4Translate(SCNMatrix4MakeScale(1, -1, 1), 0, 1, 0)
//6. Rotate The Node
overlayNode.transform = SCNMatrix4MakeRotation(-Float.pi / 2, 1, 0, 0)
//7. Place It Above The Target
let zPosition = height - (planeHeight/2)
overlayNode.position = SCNVector3(0, 0, -zPosition)
//8. Add It To The Scene
node.addChildNode(overlayNode)
}
}
This yields a floating overlay with the image name, artist, and creation date displayed above the detected painting.
Obviously if you have multiple image targets, you would create a func to dynamically create your 'info' overlay...
Hope it points you in the right direction... And apologies for the heinous spelling of corporation! ^_______*
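For the multi-target case the answer mentions, a hypothetical helper along these lines (PaintingInfo and infoOverlayNode are made-up names, and the label layout mirrors the answer's hard-coded values) could be looked up by imageAnchor.referenceImage.name inside renderer(_:didAdd:for:) and added as a child of the anchor's node:
import ARKit
import SceneKit
import SpriteKit

// Data shown on the overlay for one painting.
struct PaintingInfo {
    let title: String
    let artist: String
    let created: String
}

func infoOverlayNode(for referenceImage: ARReferenceImage, info: PaintingInfo) -> SCNNode {
    let width = referenceImage.physicalSize.width
    let height = referenceImage.physicalSize.height

    // Build the SpriteKit scene with one label per line of information.
    let spriteKitScene = SKScene(size: CGSize(width: 600, height: 300))
    spriteKitScene.backgroundColor = .clear
    for (index, text) in [info.title, info.artist, info.created].enumerated() {
        let label = SKLabelNode(text: text)
        label.position = CGPoint(x: 300, y: 240 - CGFloat(index) * 60)
        label.horizontalAlignmentMode = .center
        label.fontSize = 60
        spriteKitScene.addChild(label)
    }

    // Map the scene onto a plane half the height of the detected image.
    let overlayNode = SCNNode()
    let planeHeight = height / 2
    let overlayGeometry = SCNPlane(width: width, height: planeHeight)
    overlayGeometry.firstMaterial?.diffuse.contents = spriteKitScene
    // Flip the material contents so the text isn't mirrored.
    overlayGeometry.firstMaterial?.diffuse.contentsTransform =
        SCNMatrix4Translate(SCNMatrix4MakeScale(1, -1, 1), 0, 1, 0)
    overlayNode.geometry = overlayGeometry
    overlayNode.transform = SCNMatrix4MakeRotation(-Float.pi / 2, 1, 0, 0)
    overlayNode.position = SCNVector3(0, 0, -(height - planeHeight / 2))
    return overlayNode
}
You would keep a [String: PaintingInfo] dictionary keyed by reference image name and, in the delegate method, add the returned node for whichever image was detected.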
