I need to create WebAR using the iPhone 12's LiDAR sensor.
Is it possible to get permission or an API to access it?
Kindly suggest a good reference for my requirement.
AR QuickLook content implementation
In 2019 Apple released the AR Quick Look framework, which lets you create a web-based Augmented Reality experience right in Safari. AR Quick Look is based on the RealityKit engine; it's easy to implement and convenient to use. It automatically uses the LiDAR scanner if your iPhone has one. If there's no LiDAR scanner on board, it falls back to the regular plane detection feature.
Here's a Swift code sample for a native Xcode project:

import UIKit
import ARKit
import QuickLook

class ViewController: UIViewController {

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)

        // Present the AR Quick Look preview as soon as the view appears.
        let previewController = QLPreviewController()
        previewController.delegate = self
        previewController.dataSource = self
        self.present(previewController, animated: true, completion: nil)
    }
}

extension ViewController: QLPreviewControllerDelegate, QLPreviewControllerDataSource {

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
        return 1
    }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // Load a USDZ model bundled with the app.
        guard let path = Bundle.main.path(forResource: "file", ofType: "usdz")
        else { fatalError("Couldn't find a model") }
        let url = URL(fileURLWithPath: path)
        return url as QLPreviewItem
    }
}
Web AR content implementation
To activate your USDZ model from a web resource, use the following HTML tags:
<div>
    <a rel="ar" href="/assets/models/bicycle.usdz">
        <img src="/assets/models/bicycle-image.jpg">
    </a>
</div>
You cannot access the LiDAR scanner's parameters when AR Quick Look is launched from Safari on iOS. If your iPhone has a LiDAR scanner on board, it is used automatically.
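If you do need direct access to the LiDAR data (for example the reconstructed scene mesh), that is only possible in a native app via ARKit, not from Safari. Here's a minimal, hedged sketch of enabling scene reconstruction natively (the class name is just an illustration, not part of any framework):

import UIKit
import ARKit
import RealityKit

class LiDARViewController: UIViewController {

    let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(arView)

        let configuration = ARWorldTrackingConfiguration()
        // Scene reconstruction is only supported on LiDAR-equipped devices
        // (e.g. iPhone 12 Pro / Pro Max, recent iPad Pro models).
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            configuration.sceneReconstruction = .mesh
        }
        configuration.planeDetection = [.horizontal, .vertical]
        arView.session.run(configuration)
    }
}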
Related
I’ve been using the Agora SDK (audio only) for a while now, and I’m very pleased! Users can enter rooms of max 8 people and talk to each other. Now I’m supposed to add a video feature, so they can enable their video stream whenever they want.
I’ve added the AgoraRtcEngine_iOS 3.7.0 pod, and this is how I initialize the Agora engine:
agoraKit = AgoraRtcEngineKit.sharedEngine(withAppId: AppKeys.agoraAppId, delegate: self)
agoraKit?.setChannelProfile(.liveBroadcasting)
agoraKit?.setClientRole(.broadcaster)
agoraKit?.enableVideo()
agoraKit?.muteLocalVideoStream(true)
agoraKit?.muteLocalAudioStream(true)
agoraKit?.enableAudioVolumeIndication(1000, smooth: 3, report_vad: true)
This way, when a user joins a room, their audio and video streams are muted, so they can enable them whenever they’re ready. The thing is, I’m using a UICollectionView to present the broadcasters (everyone's cell is visible at all times, so no reuse takes place, but collection view reloads happen constantly).
This is part of the cell setup (cellForItemAt) that handles the video view using a delegate:
private func setupVideoView(uid: UInt, isOfCurrentUser: Bool) {
    videoView.frame.size = avatarBackgroundView.frame.size
    videoView.layer.cornerRadius = avatarBackgroundView.layer.cornerRadius

    if isOfCurrentUser {
        delegate?.localVideoSetupWasRequested(videoView: videoView)
    } else {
        delegate?.remoteVideoSetupWasRequested(uid: uid, videoView: videoView)
    }
}
And this is the conformance to the protocol:
extension RealTimeAudioAgoraService: NewVoiceRoomCellsDelegate {

    func localVideoSetupWasRequested(videoView: UIView) {
        guard !enabledVideoUIds.contains(currentUserRTCServiceId) else { return }
        let videoCanvas = setUpVideoView(uid: currentUserRTCServiceId, videoView: videoView)
        agoraKit?.setupLocalVideo(videoCanvas)
    }

    func remoteVideoSetupWasRequested(uid: UInt, videoView: UIView) {
        guard !enabledVideoUIds.contains(uid) else { return }
        let videoCanvas = setUpVideoView(uid: uid, videoView: videoView)
        agoraKit?.setupRemoteVideo(videoCanvas)
    }

    private func setUpVideoView(uid: UInt, videoView: UIView) -> AgoraRtcVideoCanvas {
        let videoCanvas = AgoraRtcVideoCanvas()
        videoCanvas.uid = uid
        videoCanvas.view = videoView
        videoCanvas.renderMode = .hidden
        // ------------------------------------
        enabledVideoUIds.append(uid)
        // ------------------------------------
        return videoCanvas
    }
}
As you can see, I keep track of all the uids I have enabled the video canvas for, so I do it only once per user. The thing is, something messes up the UI (see attached video), and even users with muted videos show the video canvas of others. It’s like the video canvas of each user is being cycled over every other user.
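(For reference, a hedged sketch of a re-binding variant, reusing the delegate method above; I haven't verified that it fixes the cycling, but since the cells are reloaded constantly while each canvas is bound only once, re-binding the canvas to the currently displayed view on every reload is one thing worth trying, as setupRemoteVideo can be called again to update the target view.)

func remoteVideoSetupWasRequested(uid: UInt, videoView: UIView) {
    // Re-bind on every collection view reload so the canvas always targets
    // the view that is currently on screen for this uid.
    let videoCanvas = AgoraRtcVideoCanvas()
    videoCanvas.uid = uid
    videoCanvas.view = videoView
    videoCanvas.renderMode = .hidden
    agoraKit?.setupRemoteVideo(videoCanvas)
}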
Any help will be much appreciated!
I am trying to support Apple's Markup of PDFs via UIDocumentInteractionController for files in my Documents folder on iPad. I want the documents edited in-place, so my app can load them again after the user is finished. I have set the Info.plist options for this, and the in-place editing does seem to work. Changes are saved to the same file.
When I bring up the UIDocumentInteractionController popover for the PDF, I am able to choose "Markup", which then shows the PDF ready for editing. I can edit it too. The problem is when I tap "Done": a menu appears with the options "Save File To..." and "Delete PDF". There is no option to simply close the editor or save.
The frustrating thing is, I can see via Finder that the file is actually edited in place in the simulator, and it is already saved when this menu appears. I just want the editor to disappear and not confuse the user. I.e. I want "Done" to be "Done".
Perhaps related, and also annoying: while the markup editor is visible, an extra directory is added to Documents called (A Document Being Saved By <<My App Name>>), and that folder is completely empty the whole time. Removing the folder during editing does not change anything.
Anyone have an idea if I am doing something wrong, or if there is a way to have the Done button simply dismiss?
In case others have this issue, I believe it is a bug in UIDocumentInteractionController in how it sets up the QLPreviewController it uses internally. If I proxy the delegate of the QLPreviewController, and return .updateContents from previewController(_:editingModeFor:), it works as expected.
Here is my solution. The objective is simple enough, but actually capturing the private QLPreviewController was not easy, and I ended up using a polling timer. There may be a better way, but I couldn't find it.
import UIKit
import QuickLook

class ViewController: UIViewController, UIDocumentInteractionControllerDelegate {

    /// The button that presents the options menu (assumed outlet; it is referenced below)
    @IBOutlet var button: UIButton!

    /// This wraps the original delegate of the QLPreviewController,
    /// so we can return .updateContents from previewController(_:editingModeFor:)
    let delegateProxy = QLPreviewDelegateProxy()

    var documentInteractionController = UIDocumentInteractionController()

    /// A timer we use to update the QL controller.
    /// Ideally, we would use callbacks or delegate methods, but I couldn't
    /// find a satisfactory set to do the job. Instead we poll (like an animal).
    var quicklookControllerPollingTimer: Timer?

    /// Used to track the preview controller created by UIDocumentInteractionController
    var quicklookController: QLPreviewController?

    /// File URL of the PDF we are editing
    var editURL: URL!

    override func viewDidDisappear(_ animated: Bool) {
        super.viewDidDisappear(animated)
        quicklookControllerPollingTimer?.invalidate()
    }

    @IBAction func showPopover(_ sender: Any?) {
        documentInteractionController.url = editURL
        documentInteractionController.delegate = self
        documentInteractionController.presentOptionsMenu(from: button.bounds, in: button, animated: true)

        quicklookControllerPollingTimer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { [unowned self] timer in
            guard quicklookController == nil else {
                // Forget the controller once it is off screen.
                if quicklookController?.view.window == nil {
                    quicklookController = nil
                }
                return
            }
            if let ql = presentedViewController?.presentedViewController as? QLPreviewController, ql.view.window != nil {
                self.quicklookController = ql
                // Extra delay gives the UI time to update
                DispatchQueue.main.asyncAfter(deadline: .now() + 0.1) {
                    guard let ql = self.quicklookController else { return }
                    self.delegateProxy.originalDelegate = ql.delegate
                    ql.delegate = self.delegateProxy
                    ql.reloadData()
                }
            }
        }
    }
}
class QLPreviewDelegateProxy: NSObject, QLPreviewControllerDelegate {

    weak var originalDelegate: QLPreviewControllerDelegate?

    /// All this work is just to return .updateContents here. Doing this makes it all work properly.
    /// Must be a bug in UIDocumentInteractionController.
    func previewController(_ controller: QLPreviewController, editingModeFor previewItem: QLPreviewItem) -> QLPreviewItemEditingMode {
        .updateContents
    }
}
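(If you also need the original delegate's other callbacks to keep firing, a hedged sketch using NSObject's standard message-forwarding machinery could be added inside QLPreviewDelegateProxy; this is an assumption on my part, not something required for the .updateContents fix itself.)

// Add inside QLPreviewDelegateProxy: claim to respond to anything the wrapped
// delegate responds to, and forward the selectors we don't implement ourselves.
override func responds(to aSelector: Selector!) -> Bool {
    return super.responds(to: aSelector) || (originalDelegate?.responds(to: aSelector) ?? false)
}

override func forwardingTarget(for aSelector: Selector!) -> Any? {
    if super.responds(to: aSelector) { return self }
    return originalDelegate
}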
I am trying to present a CNContactPickerViewController inside a SwiftUI application using the UIViewControllerRepresentable protocol. As I have already read, there is a known issue with this not working, but I got it working reasonably well using the workaround described here.
However, whenever the CNContactPickerViewController gets presented or dismissed, I get the following error in my output log:
[PPT] Error creating the CFMessagePort needed to communicate with PPT.
I tried to find explanations on this, but there seems to be no answer anywhere on the internet. Does someone know where this error comes from and what PPT is? Could this error have something to do with the CNContactPickerViewController not working properly with SwiftUI?
I noticed the error for the first time in the iOS 14 beta together with the Xcode 12 beta, and it is still present in iOS 14.2 with Xcode 12.2.
I don't know if the error appears on iOS 13 as well.
I already issued a feedback report about this.
I wrote a workaround using a hosting UINavigationController and here is my code:
import SwiftUI
import ContactsUI

struct ContactPickerView: UIViewControllerRepresentable {

    @Environment(\.presentationMode) var presentationMode

    func makeUIViewController(context: Context) -> UINavigationController {
        let navController = UINavigationController()
        let controller = CNContactPickerViewController()
        controller.delegate = context.coordinator
        // Present the picker from the hosting navigation controller (known workaround).
        navController.present(controller, animated: false, completion: nil)
        return navController
    }

    func updateUIViewController(_ uiViewController: UINavigationController, context: Context) {
        print("Updating the contacts controller!")
    }

    // MARK: UIViewControllerRepresentable coordinator

    func makeCoordinator() -> ContactsCoordinator {
        return ContactsCoordinator(self)
    }

    class ContactsCoordinator: NSObject, UINavigationControllerDelegate, CNContactPickerDelegate {

        let parent: ContactPickerView

        public init(_ parent: ContactPickerView) {
            self.parent = parent
        }

        func contactPickerDidCancel(_ picker: CNContactPickerViewController) {
            print("Contact picker cancelled!")
            parent.presentationMode.wrappedValue.dismiss()
        }

        func contactPicker(_ picker: CNContactPickerViewController, didSelect contact: CNContact) {
            print("Selected a contact")
            parent.presentationMode.wrappedValue.dismiss()
        }
    }
}
And I use it like:
Button("Select a contact") {
openSelectContact.toggle()
}
.sheet(isPresented: $openSelectContact, onDismiss: nil) {
ContactPickerView()
}
I am developing a small app in Swift 3 for iOS. I previously developed it for Android, where I had a problem that is replicated here: I have a web view that has to load some videos. YouTube videos play without problems (on both Android and iOS); the problem arises when trying to play a video with, for example, JWPlayer. The problem I had on Android was that the property to write to disk was not enabled, and the solution was as follows:
final WebView myBrowser;
myBrowser.getSettings().setDomStorageEnabled(true);
What would be the equivalent solution in Swift 3?
I found this solution:
var prefs: WebPreferences? = webView.preferences
prefs?._setLocalStorageDatabasePath("~/Library/Application Support/MyApp")
prefs?.localStorageEnabled = true
But I get an error:
Use of undeclared type WebPreferences
I look forward to your comments. Greetings and many thanks in advance.
I've updated my question:
import UIKit
import WebKit

class ViewController: UIViewController {

    @IBOutlet var webView: WebView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // The URL needs a scheme, otherwise nothing loads.
        webView.loadUrl(string: "https://www.google.com")
    }
}

class WebView: WKWebView {

    required init?(coder: NSCoder) {
        // Workaround for placing a WKWebView via a storyboard: decode a plain
        // UIView placeholder and adopt its frame and autoresizing mask.
        if let _view = UIView(coder: coder) {
            super.init(frame: _view.frame, configuration: WKWebViewConfiguration())
            autoresizingMask = _view.autoresizingMask
        } else {
            return nil
        }
    }

    func loadUrl(string: String) {
        if let url = URL(string: string) {
            load(URLRequest(url: url))
        }
    }
}
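For what it's worth, here is a hedged sketch of the WKWebView route: DOM storage is enabled by default when the configuration keeps a persistent website data store, so something like the configuration below may be all that's needed (the player URL is a placeholder, not a real page):

import WebKit

// Hedged sketch: WKWebView has no direct setDomStorageEnabled equivalent;
// local/session storage works when the configuration uses a persistent data store.
let configuration = WKWebViewConfiguration()
configuration.websiteDataStore = WKWebsiteDataStore.default()   // persistent storage
configuration.allowsInlineMediaPlayback = true                  // let the player render inline

let webView = WKWebView(frame: .zero, configuration: configuration)
webView.load(URLRequest(url: URL(string: "https://example.com/player.html")!))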
Is there a way to extend the Quick Look Framework on iOS to handle an unknown file type like on Mac? I don't want to have to switch to my app to preview the file, much like viewing image files in email or iMessage. I would like to remove the step of having to select what app to use to open the file.
On Mac they call it a Quick Look Generator, but I can't find a way to do it on iOS
This is how you use the Quick Look framework in iOS.
Xcode 8.3.2, Swift 3
First, go to Build Phases and add QuickLook.framework under Link Binary With Libraries.
Next, import QuickLook in your ViewController class.
Next, make your ViewController class adopt QLPreviewControllerDataSource so it can supply the items that QuickLook.framework previews (see below).
class ViewController: UIViewController , QLPreviewControllerDataSource {
}
Next, create an instance of QLPreviewController in your class as below:
let quickLookController = QLPreviewController()
Now set the data source in your viewDidLoad method:
override func viewDidLoad() {
    super.viewDidLoad()
    quickLookController.dataSource = self
}
Now create a fileURLs array to store the paths of all the documents you will later pass to QLPreviewController via the data source methods.
var fileURLs = [URL]()
Now add the methods below to your class to tell QLPreviewController the total number of documents and which item to preview at each index.
// Note: the pre-Swift 3 signature numberOfPreviewItemsInPreviewController(controller:)
// is no longer called; in Swift 3 the data source methods are the two below.
func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
    return fileURLs.count
}

func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
    return fileURLs[index] as QLPreviewItem
}
Finally, the method that shows your docs. You can also check whether the document you want to preview can actually be previewed, as below.
func showMyDocPreview(currIndex: Int) {
    if QLPreviewController.canPreview(fileURLs[currIndex] as QLPreviewItem) {
        quickLookController.currentPreviewItemIndex = currIndex
        navigationController?.pushViewController(quickLookController, animated: true)
    }
}
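For example, a hedged usage sketch ("sample.pdf" is just a placeholder file assumed to be in the app bundle, not something from the steps above):

if let url = Bundle.main.url(forResource: "sample", withExtension: "pdf") {
    fileURLs.append(url)
    showMyDocPreview(currIndex: 0)
}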
For now, if you want to show a preview of a file type that the standard QLPreviewController doesn't handle, you have to write something yourself in your own app. You cannot write a custom Quick Look plugin like you can on the Mac. A hedged sketch of that fallback follows below.
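Here, MyCustomFileViewController stands in for a hypothetical viewer of your own; it is not a framework class, and the rest reuses the names from the answer above:

func preview(_ url: URL) {
    if QLPreviewController.canPreview(url as QLPreviewItem) {
        // Quick Look handles this type, so reuse the standard preview flow.
        fileURLs = [url]
        quickLookController.reloadData()
        showMyDocPreview(currIndex: 0)
    } else {
        // Quick Look can't handle this type, so fall back to your own viewer.
        let fallback = MyCustomFileViewController(fileURL: url)
        navigationController?.pushViewController(fallback, animated: true)
    }
}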