Which SpriteKit or SceneKit node should I use to display a web page in ARKit? We usually embed other iOS views in an AR plane as materials.
I got empty pages containing only the background color of the loaded site when I tried to add my WKWebView instance directly to my node like this:
node.geometry?.firstMaterial?.diffuse.contents = webView
The only way I could show WKWebView content in my ARKit 3D environment was to load the web view, wait for the page to finish loading, take a screenshot of it, and use that screenshot as the node's material.
class ViewController: UIViewController, ARSCNViewDelegate, WKNavigationDelegate {
    @IBOutlet var sceneView: ARSCNView!
    let webView = WKWebView(frame: CGRect(x: 0, y: 0, width: 500, height: 500), configuration: WKWebViewConfiguration())
    ...
    override func viewDidLoad() {
        super.viewDidLoad()
        ...
        webView.navigationDelegate = self
        let url = URL(string: "https://stackoverflow.com")!
        webView.load(URLRequest(url: url))
    }
    // WKNavigationDelegate method
    func webView(_ webView: WKWebView, didFinish navigation: WKNavigation!) {
        // The page should be loaded by now. Sometimes it takes even longer,
        // so you can add a delay before taking the screenshot.
        let screenshot = webView.screenshot()
        let node = SCNNode()
        node.geometry = SCNPlane(width: 2, height: 2)
        node.geometry?.firstMaterial?.diffuse.contents = screenshot
        node.geometry?.firstMaterial?.isDoubleSided = true
        node.position = SCNVector3(0.1, 0.2, -2)
        self.sceneView.scene.rootNode.addChildNode(node)
    }
}
The screenshot method for the web view:
extension WKWebView {
    func screenshot() -> UIImage? {
        UIGraphicsBeginImageContextWithOptions(self.bounds.size, true, 0)
        self.drawHierarchy(in: self.bounds, afterScreenUpdates: true)
        let snapshotImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return snapshotImage
    }
}
WKWebView didn't work well with ARKit for me, but UIWebView can show its contents in ARKit.
SpriteKit is used for 2D content and SceneKit for 3D, so SpriteKit would be more appropriate here, although you can still place 2D planes in a 3D scene.
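For completeness, SceneKit materials can also take a live SKScene as their contents, which is one way to get dynamic 2D content onto a plane without screenshots (web views still won't render this way). A minimal sketch; the scene size, plane dimensions, and label text are arbitrary assumptions:

```swift
import SceneKit
import SpriteKit

// A SpriteKit scene used as live material contents.
let skScene = SKScene(size: CGSize(width: 512, height: 512))
skScene.backgroundColor = .white
let label = SKLabelNode(text: "Hello from SpriteKit")
label.fontColor = .black
label.position = CGPoint(x: 256, y: 256)
skScene.addChild(label)

// Use the SKScene as the diffuse contents of a 3D plane.
let plane = SCNPlane(width: 0.5, height: 0.5)
plane.firstMaterial?.diffuse.contents = skScene
plane.firstMaterial?.isDoubleSided = true
let node = SCNNode(geometry: plane)
node.position = SCNVector3(0, 0, -1)
// sceneView.scene.rootNode.addChildNode(node)
```

Note that SpriteKit content may render vertically flipped when used as material contents; if so, adjust the material's diffuse.contentsTransform.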
When navigating to a subview, I have set it up so that a video plays automatically. At the bottom of the video, there is a group of links that go to related content. When clicking one of them, a new view is pushed onto the stack and a different video starts playing.
The problem happens when using the automatically generated '< Back' button to go back to the prior view (which had a different video). This original view can be operated using the player controls, but nothing shows up on the screen.
I've tried to update the CGRect frame, use onAppear to reinitialize the video player, and also followed the advice here.
So far nothing seems to work. Here is the code I am using for the actual video player (adapted from Chris Mash's website):
import SwiftUI
import AVKit
import UIKit
import AVFoundation
let playerLayer = AVPlayerLayer()
class PlayVideo: UIView {
    init(frame: CGRect, url: URL) {
        super.init(frame: frame)
        // Create the video player using the URL passed in.
        let player = AVPlayer(url: url)
        player.volume = 100 // Will play audio if you don't set it to zero
        player.play() // Set to play once created
        // Add the player to our player layer
        playerLayer.player = player
        playerLayer.videoGravity = .resizeAspectFill // Resizes content to fill the whole video layer.
        playerLayer.backgroundColor = UIColor.black.cgColor
        layer.addSublayer(playerLayer)
    }
    required init?(coder: NSCoder) {
        super.init(coder: coder)
    }
    override func layoutSubviews() {
        super.layoutSubviews()
        playerLayer.frame = bounds
    }
    static func pauseVideo() {
        playerLayer.player?.pause()
    }
}
struct ViewVideo: UIViewRepresentable {
    var videoURL: URL
    var previewLength: Double?
    func makeUIView(context: Context) -> UIView {
        return PlayVideo(frame: .zero, url: videoURL)
    }
    func updateUIView(_ uiView: UIView, context: Context) {
    }
}
This is called from the main view using:
ViewVideo(videoURL: videoURL)
The only work around I can think of is to disable the back button and force the user to go back to the main view every time. That's a terrible option and I'm hoping someone will have some helpful advice here. Thanks -
If I understand this correctly, you play a different video when you navigate to the new view, so you create a new PlayVideo view?
Then the problem is that your playerLayer is a global property. The new view will set a new player into the playerLayer and replace the old one. Similarly, if you pause one player, both are paused. Additionally, adding the player as a sublayer to the new view removes it from the old view.
You need the playerLayer to be a local property of your view, or at least an AVPlayerLayer for every video you want to play. You also need a mechanism for pausing/restarting each video when it becomes visible, for example by implementing viewWillAppear, which is always called when you navigate back to a view and it becomes visible again.
Thank you to @dominik-105 for helping with this question. I was able to fix the problem using the suggestions that were made.
Specifically, I removed the global definition of playerLayer and instead placed it as a local variable in my main view call:
var playerLayer = AVPlayerLayer()
I then call ViewVideo with the playerLayer: playerLayer tag, and the ViewVideo then calls PlayVideo with playerLayer:AVPlayerLayer as part of the init.
Interestingly, this led to problems with the layoutSubviews override I was using to size the video box. I now define the frame directly in the init and have removed the old override. The full code is now:
import SwiftUI
import AVKit
import UIKit
import AVFoundation
class PlayVideo: UIView {
    init(frame: CGRect, url: URL, playerLayer: AVPlayerLayer, width: CGFloat, height: CGFloat) {
        super.init(frame: frame)
        // Create the video player using the URL passed in.
        let player = AVPlayer(url: url)
        player.volume = 100 // Will play audio if you don't set it to zero
        player.play() // Set to play once created
        // Add the player to our player layer
        playerLayer.player = player
        playerLayer.videoGravity = .resizeAspectFill // Resizes content to fill the whole video layer.
        playerLayer.backgroundColor = UIColor.black.cgColor
        playerLayer.player?.actionAtItemEnd = .pause
        layer.addSublayer(playerLayer)
        playerLayer.frame = CGRect(x: 0, y: 0, width: width, height: height)
    }
    required init?(coder: NSCoder) {
        super.init(coder: coder)
    }
}
struct ViewVideo: UIViewRepresentable {
    var videoURL: URL
    var playerLayer: AVPlayerLayer
    var width: CGFloat
    var height: CGFloat
    func makeUIView(context: Context) -> UIView {
        return PlayVideo(frame: .zero, url: videoURL, playerLayer: playerLayer, width: width, height: height)
    }
    func updateUIView(_ uiView: UIView, context: Context) {
    }
}
I use GeometryReader to define the size of the box, and then pass along the width and height to the struct, which passes it along to the class.
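For reference, the GeometryReader call site might look roughly like this. This is a sketch, not the poster's exact code: VideoContainerView is an illustrative name, and holding the AVPlayerLayer in @State is one way to give each view its own layer.

```swift
import SwiftUI
import AVFoundation

// Hypothetical call site showing the GeometryReader wiring.
struct VideoContainerView: View {
    let videoURL: URL
    // One AVPlayerLayer per view, so navigating to another video
    // no longer replaces a shared global layer.
    @State private var playerLayer = AVPlayerLayer()

    var body: some View {
        GeometryReader { geometry in
            ViewVideo(videoURL: videoURL,
                      playerLayer: playerLayer,
                      width: geometry.size.width,
                      height: geometry.size.height)
        }
    }
}
```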
Is it possible to put a loading animation over the VNDocumentCameraViewController? That is, when the user presses the Save button, is there a way for me to somehow indicate that Vision is processing the image and hasn't frozen? Right now, in my app, there is a long pause between the user pressing Save and the image actually being processed. Here is an example from another post of what I'm trying to create.
Here is one example of adding a loading indicator using UIActivityIndicatorView.
Call startAnimating() to start the animation and stopAnimating() to stop it.
iOS - Display a progress indicator at the center of the screen rather than the view
guard let topWindow = UIApplication.shared.windows.last else {return}
let overlayView = UIView(frame: topWindow.bounds)
overlayView.backgroundColor = UIColor.clear
topWindow.addSubview(overlayView)
let hudView = UIActivityIndicatorView()
hudView.bounds = CGRect(x: 0, y: 0, width: 20, height: 20)
overlayView.addSubview(hudView)
hudView.center = overlayView.center
hudView.startAnimating()
Alternatively, you could look into using the MBProgressHUD CocoaPod:
https://cocoapods.org/pods/MBProgressHUD
There's a way you can extend a class in Swift that captures this problem well. The idea is that you want a UIActivityIndicatorView in your VNDocumentCameraViewController, and you'd like it to be part of every instance you use. You could simply embed the document VC's view into your current view and superimpose a UIActivityIndicatorView above it in the view stack, but that's pretty hacky. Here's a quick way to extend the class and solve the problem:
import VisionKit
import UIKit
extension VNDocumentCameraViewController {
    private struct LoadingContainer {
        static var loadingIndicator = UIActivityIndicatorView()
    }
    var loadingIndicator: UIActivityIndicatorView {
        return LoadingContainer.loadingIndicator
    }
    func animateLoadingIndicator() {
        if loadingIndicator.superview == nil {
            view.addSubview(loadingIndicator)
            // Set up your constraints through your favorite method.
            // This centers the 20x20 indicator in the controller's view.
            loadingIndicator.frame = CGRect(
                x: view.bounds.midX - 10,
                y: view.bounds.midY - 10,
                width: 20,
                height: 20)
            // Set up additional state like color/etc. here
            loadingIndicator.color = .white
        }
        loadingIndicator.startAnimating()
    }
    func stopAnimatingLoadingIndicator() {
        loadingIndicator.stopAnimating()
    }
}
The place we can call these functions are in the delegate methods for VNDocumentCameraViewController that you implement in your presenting ViewController:
func documentCameraViewController(
    _ controller: VNDocumentCameraViewController,
    didFinishWith scan: VNDocumentCameraScan
) {
    controller.animateLoadingIndicator()
}
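Since the question is about the long pause after Save, presumably you stop the indicator once your own processing completes. One hedged sketch, where processScan(_:) is a hypothetical stand-in for your Vision work, not a real API:

```swift
func documentCameraViewController(
    _ controller: VNDocumentCameraViewController,
    didFinishWith scan: VNDocumentCameraScan
) {
    controller.animateLoadingIndicator()
    // Do the heavy Vision work off the main thread;
    // processScan(_:) is a placeholder for your own processing.
    DispatchQueue.global(qos: .userInitiated).async {
        self.processScan(scan)
        DispatchQueue.main.async {
            controller.stopAnimatingLoadingIndicator()
            controller.dismiss(animated: true)
        }
    }
}
```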
I have successfully loaded a PDF file from a URL into a web view. I want to get the coordinates of a touch on the web view and send them to the server, which will add some text at that position on the server side. The problem is how to get those coordinates when the web view is zoomed or scrolled to page 2, 3, and so on.
Could you please help me solve this problem in Swift 3?
Here the code
class ViewController: UIViewController, UITextFieldDelegate, UIGestureRecognizerDelegate, UIWebViewDelegate {
    @IBOutlet var imageView: UIImageView!
    @IBOutlet var webView: UIWebView!
    override func viewDidLoad() {
        super.viewDidLoad()
        let url = URL(string: "http://www.pdf995.com/samples/pdf.pdf")!
        webView.loadRequest(URLRequest(url: url))
        let webViewTapped = UITapGestureRecognizer(target: self, action: #selector(self.tapAction(_:)))
        webViewTapped.numberOfTouchesRequired = 1
        webViewTapped.delegate = self
        webView.addGestureRecognizer(webViewTapped)
    }
    func tapAction(_ sender: UITapGestureRecognizer) {
        // This always returns the same coordinates, zoomed or not
        let point = sender.location(in: webView)
        print("======")
        print("x: \(point.x) y: \(point.y)")
    }
}
PS: I am a newbie in Swift.
user3783161,
First of all, you can remove the following line if you don't use any method from UIGestureRecognizerDelegate:
webViewTapped.delegate = self
To get the actual coordinates when you touch the UIWebView, you need to get the location in its UIScrollView, not in the UIWebView itself.
func tapAction(_ sender: UITapGestureRecognizer) {
    // This always returns the same coordinates, zoomed or not:
    // let point = sender.location(in: webView)
    // print("======WebView")
    // print("x: \(point.x) y: \(point.y)")
    // Get the location from webView.scrollView instead
    let point = sender.location(in: webView.scrollView)
    print("======ScrollView")
    print("x: \(point.x) y: \(point.y)")
}
But you still have a problem, because the location is relative to the content size of the UIWebView, not to the PDF document.
I did not find a ready-made answer for this, but you can do a simple conversion between the PDF paper size and the web view's content size, since the PDF carries this information (the PDF can be A4/A3/A5/etc.).
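As a sketch of that conversion: if you know the scroll view's content size and the PDF's paper size in points (A4 is 595 x 842, for example), you can scale the tap point proportionally. convertToPDFCoordinates is a hypothetical helper; it assumes a single page laid out across the whole content area and ignores margins and multi-page offsets:

```swift
import Foundation

/// Scale a tap point from the scroll view's content coordinate space
/// into PDF page coordinates (single page, origin at the top-left).
func convertToPDFCoordinates(tap: CGPoint,
                             contentSize: CGSize,
                             pageSize: CGSize) -> CGPoint {
    let scaleX = pageSize.width / contentSize.width
    let scaleY = pageSize.height / contentSize.height
    return CGPoint(x: tap.x * scaleX, y: tap.y * scaleY)
}

// Example: a tap at (200, 400) on content sized 1000 x 2000,
// mapped onto an A4 page (595 x 842 points).
let pdfPoint = convertToPDFCoordinates(tap: CGPoint(x: 200, y: 400),
                                       contentSize: CGSize(width: 1000, height: 2000),
                                       pageSize: CGSize(width: 595, height: 842))
// pdfPoint is (119.0, 168.4)
```

Because the point comes from webView.scrollView, zooming is already accounted for by the larger content size.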
I am writing an app in Swift where I am trying to load previews of HTML pages in an iCarousel, using a UIWebView for each page.
The problem is that I am unable to preview HTML pages; when I try to preview images instead, it works.
Here is my code
func carousel(carousel: iCarousel, viewForItemAtIndex index: Int, reusingView view: UIView?) -> UIView {
    var webView: UIImageView!
    let webV: UIWebView = UIWebView(frame: CGRectMake(0, 0, 200, 200))
    webV.loadRequest(NSURLRequest(URL: NSURL(string: "http://www.google.com")!))
    webV.contentMode = .ScaleAspectFit
    if webView == nil {
        webView = UIImageView(frame: CGRect(x: 0, y: 0, width: 200, height: 200))
        webView.contentMode = .ScaleAspectFit
    } else {
        webView = view as! UIImageView
    }
    webView.image = UIImage(named: "left-arrow.png")!
    return webView // returns image carousel
    // return webV // returns blank carousel. I want this to work!!
}
How do I achieve this functionality ?
Edit 1
I can preview images for
carousel.type = iCarouselType.Linear (screenshot attached).
This is how it looks.
But I want it for any other carousel type, say
carousel.type = iCarouselType.CoverFlow; this is what I get.
This is what I get when I am returning the UIImageView.
I have achieved this functionality by using WKWebView instead of UIWebView
Here is the working code
let webV = WKWebView(frame: CGRectMake(20, 20, 200, 200))
let url = NSURL(string: "http://www.fb.com/")
let req = NSURLRequest(URL: url!)
webV.loadRequest(req)
return webV
Also, make sure you have imported WebKit by adding the line below:
import WebKit
What is the most efficient way to add a GIF/video to the background of the landing screen (home screen or first view controller) of my app in Xcode, i.e. like apps such as Spotify, Uber, Instagram, etc.? Given that my app is universal, how would I make it fit accordingly?
Do you mean the first screen that is displayed after your app is launched? If so: unfortunately you can't have dynamic content; you won't be able to use a gif/video.
That said, if you have some app setup running on background threads that will take some time anyway, or if you simply want the user to wait longer before interaction so that you can display the gif/video, you can make the static launch image match the first frame of the gif/video and have your entry point be a view controller that displays the actual gif/video. Because this delays the time to interaction, though, it isn't recommended.
As for making it fit: as of iOS 8 Apple recommends using LaunchScreen.xib. With it you can use Auto Layout to achieve universality.
To add a video you can use MPMoviePlayerController, AVPlayer, or, if you're using SpriteKit, an SKVideoNode.
EDIT (in response to follow-up comments):
An NSURL is a reference to a local or remote file. This link will give you a decent overview. Just copy the movie in and follow that guide.
In addition to the MPMoviePlayerController solution Saqib Omer suggested, here's an alternative method that uses a UIView with an AVPlayerLayer. It has a button on top of the video as an example, since that's what you're looking for.
import AVKit
import AVFoundation
import UIKit
class ViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Start with a generic UIView and add it to the ViewController's view
        let myPlayerView = UIView(frame: self.view.bounds)
        myPlayerView.backgroundColor = UIColor.blackColor()
        view.addSubview(myPlayerView)
        // Use a local or remote URL (force-unwrapped here; see the note on NSURL above)
        let url = NSURL(string: "http://eoimages.gsfc.nasa.gov/images/imagerecords/76000/76740/iss030-e-6082_sdtv.mov")!
        // Make a player
        let myPlayer = AVPlayer(URL: url)
        myPlayer.play()
        // Make the AVPlayerLayer and add it to myPlayerView's layer
        let avLayer = AVPlayerLayer(player: myPlayer)
        avLayer.frame = myPlayerView.bounds
        myPlayerView.layer.addSublayer(avLayer)
        // Make a button and add it to myPlayerView (you'd need to add an action, of course)
        let myButtonOrigin = CGPoint(x: myPlayerView.bounds.size.width / 3, y: myPlayerView.bounds.size.height / 2)
        let myButtonSize = CGSize(width: myPlayerView.bounds.size.width / 3, height: myPlayerView.bounds.size.height / 10)
        let myButton = UIButton(frame: CGRect(origin: myButtonOrigin, size: myButtonSize))
        myButton.setTitle("Press Me!", forState: .Normal)
        myButton.setTitleColor(UIColor.whiteColor(), forState: .Normal)
        myPlayerView.addSubview(myButton)
    }
}
For playing video, add the following code. Declare a class variable var moviePlayer: MPMoviePlayerController!, then in your viewDidLoad():
let url = NSURL(string: "YOUR_URL_FOR_VIDEO")!
moviePlayer = MPMoviePlayerController(contentURL: url)
moviePlayer.view.frame = CGRect(x: 0, y: 0, width: 200, height: 150)
self.view.addSubview(moviePlayer.view)
moviePlayer.fullscreen = true
moviePlayer.controlStyle = MPMovieControlStyle.Embedded
This will play the video. But to make it fit you need to add layout constraints. See this link on adding constraints programmatically.
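For example, pinning the player's view edge-to-edge with programmatic Auto Layout might look like this (a sketch in later anchor-based syntax; it assumes moviePlayer.view has already been added as a subview):

```swift
// Pin the movie player's view to the edges of the controller's view.
moviePlayer.view.translatesAutoresizingMaskIntoConstraints = false
NSLayoutConstraint.activate([
    moviePlayer.view.topAnchor.constraint(equalTo: view.topAnchor),
    moviePlayer.view.bottomAnchor.constraint(equalTo: view.bottomAnchor),
    moviePlayer.view.leadingAnchor.constraint(equalTo: view.leadingAnchor),
    moviePlayer.view.trailingAnchor.constraint(equalTo: view.trailingAnchor)
])
```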
import MediaPlayer
class ViewController: UIViewController {
    var moviePlayer: MPMoviePlayerController!
    override func viewDidLoad() {
        super.viewDidLoad()
        // Load the video from the app bundle.
        let videoURL: NSURL = NSBundle.mainBundle().URLForResource("video", withExtension: "mp4")!
        // Create and configure the movie player.
        self.moviePlayer = MPMoviePlayerController(contentURL: videoURL)
        self.moviePlayer.controlStyle = MPMovieControlStyle.None
        self.moviePlayer.scalingMode = MPMovieScalingMode.AspectFill
        self.moviePlayer.view.frame = self.view.frame
        self.view.insertSubview(self.moviePlayer.view, atIndex: 0)
        self.moviePlayer.play()
        // Loop video.
        NSNotificationCenter.defaultCenter().addObserver(self, selector: "loopVideo", name: MPMoviePlayerPlaybackDidFinishNotification, object: self.moviePlayer)
    }
    func loopVideo() {
        self.moviePlayer.play()
    }
}
https://medium.com/@kschaller/ios-video-backgrounds-6eead788f190#.2fhxmc2da