ARSKView snapshot iOS

Has anyone had any luck with taking screenshots of the ARSKView in ARKit projects?
I tried something like this:
if let sc = view.snapshotView(afterScreenUpdates: true) {
    DispatchQueue.main.async {
        UIGraphicsBeginImageContextWithOptions(sc.bounds.size, true, 0.0)
        sc.drawHierarchy(in: sc.bounds, afterScreenUpdates: true)
        let screenshot = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        UIImageWriteToSavedPhotosAlbum(screenshot!, nil, nil, nil)
    }
}
But the output is a black image.
In Apple's example they use ARSCNView, which has a snapshot function that does everything automatically.
I'm using SpriteKit, not SceneKit, and ARSKView has no snapshot method.

I also got a black image when using drawHierarchy(in:afterScreenUpdates:) on the ARSKView.
Another approach that failed for me: using the ARSession's currentFrame property and converting the capturedImage pixel buffer to a UIImage. That just shows the image of the real world; it doesn't include the sprites.
What worked for me was just taking a screenshot from the View Controller view instead of the ARSKView.
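For illustration, a minimal sketch of that workaround, assuming the ARSKView is embedded in the view controller's root view (the function is my own, not from the thread):
func captureScreen(of viewController: UIViewController) -> UIImage {
    // Render the view controller's root view, which contains the ARSKView,
    // instead of snapshotting the ARSKView directly.
    let rootView: UIView = viewController.view
    let renderer = UIGraphicsImageRenderer(size: rootView.bounds.size)
    return renderer.image { _ in
        rootView.drawHierarchy(in: rootView.bounds, afterScreenUpdates: true)
    }
}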
It's still a possibility to just switch to SceneKit since you can still use SpriteKit as a material on an SCNNode. SceneKit comes with nice stuff like snapshot as well.
There are also open-source implementations such as ARVideoKit: https://github.com/AFathi/ARVideoKit

Related

ARSCNView snapshot() causes latency

I'm taking a snapshot of every frame, applying a filter, and updating the background contents of the ARSCNView with the filtered image. Everything works, but all the UI elements on the screen become very laggy; the ARSCNView itself shows no latency.
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    guard let image = CIImage(image: sceneView.snapshot()) else { return }
    // I'm setting a filter on each image here, which has no effect on the latency.
    sceneView.scene.background.contents = context.createCGImage(image, from: image.extent)
}
I know I can use frame.capturedImage, which makes the latency go away. However, frame.capturedImage is just the raw camera image, so it excludes the AR objects I place on the screen; and since sceneView.scene.background.contents cannot be reset to its original source, I cannot turn off the image filter. That's why I need to take a snapshot.
Is there anything I can do that will reduce latency on the UI elements? I have a few UIScrollViews on the screen that have tremendous lag.
I'm also in the middle of looking for a way to do this with no lag, but I was able to at least reduce the lag by rendering the view into an image manually:
extension ARSCNView {
    /// Performs the screen snapshot manually; seems faster than the built-in snapshot() function, but still somewhat noticeable.
    var snapshot: UIImage? {
        let renderer = UIGraphicsImageRenderer(size: self.bounds.size)
        let image = renderer.image { context in
            self.drawHierarchy(in: self.bounds, afterScreenUpdates: true)
        }
        return image
    }
}
It's frustrating that this is faster than the built-in snapshot function, but it seems to be, and it still captures all the SceneKit graphics in the snapshot. (Doing this every frame will still be expensive, though; the only real solution for that would likely be a custom Metal shader.)
I also tried working with ARSCNView.snapshotView(afterScreenUpdates:), because that seems to have essentially no lag for my purposes, but whenever I try to turn the resulting view into a UIImage, it's totally blank. Either way, the method above cut the lag roughly in half for me, so you might have some luck with it.
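For reference, a sketch of how the manual snapshot above could be wired into the original delegate method; this pairing is my own illustration, not part of the answer, and it reuses the asker's CIContext named context:
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // Use the manual drawHierarchy-based snapshot from the extension above
    // instead of the built-in sceneView.snapshot().
    guard let manualSnapshot = sceneView.snapshot,
          let image = CIImage(image: manualSnapshot) else { return }
    sceneView.scene.background.contents = context.createCGImage(image, from: image.extent)
}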

Adding a UIView Is Laggy on Specific iPad Models

I have a pdf viewer for sheet music, which is based on PDFKit. PDFKit has an option to use an internal UIPageViewController, but it is very problematic - you cannot set the transition type, and worse than that, there is no way to check whether a page swipe succeeded or failed. You end up seeing one page, while the reported page index is another one.
Therefore I decided to create my own page-flipping method. I added a UITapGestureRecognizer, and when the right or left edge is tapped, the page flips programmatically. To achieve the curl animation, I add a UIView showing an image of what's underneath it, run the curl animation on the PDFView, and then remove the view. Here is part of the code:
// Function to flip pages with a page curl
func flipPage(direction: String) {
    let renderer = UIGraphicsImageRenderer(size: pdfView.bounds.size)
    let image = renderer.image { ctx in
        pdfView.drawHierarchy(in: pdfView.bounds, afterScreenUpdates: true)
    }
    let imageView = UIImageView(image: image)
    imageView.frame = pdfView.frame
    imageView.tag = 830
    self.view.addSubview(imageView)
    self.view.bringSubviewToFront(imageView)
    if direction == "forward" && pdfView.canGoToNextPage() {
        pdfView.goToNextPage(nil)
        let currentImageView = self.view.subviews.filter { $0.tag == 830 }
        if currentImageView.count > 0 {
            UIView.transition(from: currentImageView[0],
                              to: pdfView, duration: 0.3,
                              options: [.transitionCurlUp, .allowUserInteraction],
                              completion: { finished in
                                  currentImageView[0].removeFromSuperview()
                              })
        }
    }
    // ... (the "backward" case is handled similarly)
}
Now comes the weird part. On my own 12.9-inch iPad Pro (1st generation), this method of flipping is blazing fast. No matter the build configuration or optimization level, it simply works: if I tap in fast succession, the pages flip as fast as I tap.
I have users with the 2nd generation 12.9-inch iPad Pro, and they experience terrible lag when the UIView is drawn on top of the PDFView. This happens on all build configurations: it happened with a release build, and also when I installed a debug build from my computer on such a device (sadly, I could not keep the device to explore things further).
There are several other places in the app where I add a UIView on top, to add a semi-transparent veil or to capture UIGestureRecognizers. On my own device, these are all very fast. On the 2nd generation iPad Pro, each and every one causes a lag. Incidentally, a user with a 3rd generation iPad Pro reported that performance was very fast on his device, without any lag. On the simulator the animation is sometimes incomplete, but the response is as fast as it should be, for all iPad models.
I searched for answers, and found absolutely no references to such a weird situation. Has anyone experienced anything like this? Any quick fixes, or noticeable problems in the logic of my code?
I am afraid that if I try to draw the custom UIViews ahead of time, and only bring them to the front when needed, I'll end up with a ridiculously large number of UIViews in the background and simply move the delay elsewhere.
After doing a bit of research, I can provide a solution for people who face similar issues. The problem appears to be scheduling.
I still do not know why the 2017 models schedule their threads differently; any ideas about why this problem reared its head in the first place are welcome. However:
I was not, in fact, following best practice. Changes to the UI should always happen on the main thread, so if you encounter a lag like this, wrap the actual adding and removing of the UIView like this:
DispatchQueue.main.async {
    self.view.addSubview(imageView)
    self.view.bringSubviewToFront(imageView)
}
My users report the problem just vanished after that.
EDIT
Be sure to include both the adding of the UIView and the animation block in the same DispatchQueue closure, otherwise they will compete for the execution slot. My final code looks like this:
func flipPage(direction: String) {
    let renderer = UIGraphicsImageRenderer(size: pdfView.bounds.size)
    let image = renderer.image { ctx in
        pdfView.drawHierarchy(in: self.pdfView.bounds, afterScreenUpdates: true)
    }
    let imageView = UIImageView(image: image)
    imageView.frame = pdfView.frame
    if direction == "forward" && pdfView.canGoToNextPage() {
        DispatchQueue.main.async {
            self.view.addSubview(imageView)
            self.view.bringSubviewToFront(imageView)
            self.pdfView.goToNextPage(nil)
            UIView.transition(from: imageView,
                              to: self.pdfView, duration: 0.3,
                              options: [.transitionCurlUp, .allowUserInteraction],
                              completion: { finished in
                                  imageView.removeFromSuperview()
                              })
        }
    }
}
P.S. If possible, avoid using drawHierarchy; it is not a very fast method.
In any case, if you need to code differently for specific devices, check out DeviceKit, a wonderful project that gives you the simplest interface possible.
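For instance, a minimal sketch of the kind of check DeviceKit enables, assuming a recent DeviceKit release; the device list and function are my own example, not from the answer:
import DeviceKit

// Hypothetical: take the dispatch-based path only on models that
// reportedly exhibit the lag.
let laggyDevices: [Device] = [.iPadPro12Inch2]

func shouldDeferViewInsertion() -> Bool {
    // Device.current identifies the concrete model at runtime.
    return laggyDevices.contains(Device.current)
}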

Is there a way to display camera images without using AVCaptureVideoPreviewLayer?

Is there a way to display camera images without using AVCaptureVideoPreviewLayer?
I want to do a screen capture, but I cannot get it to work.
session = AVCaptureSession()
camera = AVCaptureDevice.default(
    AVCaptureDevice.DeviceType.builtInWideAngleCamera,
    for: AVMediaType.video,
    position: .front)
do {
    input = try AVCaptureDeviceInput(device: camera)
} catch let error as NSError {
    print(error)
}
if session.canAddInput(input) {
    session.addInput(input)
}
let previewLayer = AVCaptureVideoPreviewLayer(session: session)
cameraView.backgroundColor = UIColor.red
previewLayer.frame = cameraView.bounds
previewLayer.videoGravity = AVLayerVideoGravity.resizeAspect
cameraView.layer.addSublayer(previewLayer)
session.startRunning()
I am currently trying to broadcast a screen capture that composites the camera image with some UIViews. However, when I use AVCaptureVideoPreviewLayer, the preview layer does not show up in the screen capture, and without it the camera image is not displayed at all. That is why I want to display the camera image in some way that screen capture can pick up.
Generally, views that are displayed using the GPU directly may not be redrawn on the CPU. This includes situations like OpenGL content or these preview layers.
A "screen capture" redraws the screen in a new context on the CPU, which obviously misses the GPU part.
You should try playing around with adding some outputs to the session, which will give you images, or rather CMSampleBuffer shots, that may be used to generate the image.
There are plenty of ways to do this, but you will most likely need to go a step lower. You can add an output to your session to receive samples directly. Doing this takes a bit of code, so please refer to some other posts like this one. The point is that you will have a didOutputSampleBuffer method that feeds you CMSampleBufferRef objects, which may be used to construct pretty much anything in terms of images.
Now in your case I assume you are aiming to get a UIImage from the sample buffer. To do so you may again need a bit of code, so refer to some other post like this one.
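To make that concrete, here is a minimal sketch of the approach, assuming a Swift 4+ project; the class name and queue label are my own illustration, not from the answer:
import AVFoundation
import CoreImage
import UIKit

// Hypothetical sketch: receive frames through AVCaptureVideoDataOutput
// and convert each CMSampleBuffer into a UIImage.
class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let context = CIContext()

    func attach(to session: AVCaptureSession) {
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        if session.canAddOutput(output) {
            session.addOutput(output)
        }
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return }
        let image = UIImage(cgImage: cgImage)
        // Deliver the image on the main thread, e.g.:
        // DispatchQueue.main.async { imageView.image = image }
        _ = image
    }
}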
To put it all together, you could simply use an image view and drop the preview layer: as you get each sample buffer, create an image and update your image view. I am not sure what the performance of this would be, but I discourage it. If the image itself is enough for your case, then you don't need a view snapshot at all.
But IF you do:
On snapshot, create this image. Then overlay your preview layer with an image view showing the generated image (add it as a subview). Then create the snapshot and remove the image view, all in a single chunk:
func snapshot() -> UIImage? {
    let imageView = UIImageView(frame: self.previewPanelView.bounds)
    imageView.image = self.imageFromLatestSampleBuffer()
    imageView.contentMode = .scaleAspectFill // Not sure
    self.previewPanelView.addSubview(imageView)
    let image = createSnapshot()
    imageView.removeFromSuperview()
    return image
}
Let us know how things turn out and what you tried; what did or did not work.

Losing quality when rendering UIImageView in Swift

I am using the code below in Swift to capture a UIImageView into a single image. It works, but the image is not the same quality as the one shown in the UIImageView. Is there a way to configure the quality when capturing this image?
private func getScreenshot(imageView: UIImageView) -> UIImage {
    UIGraphicsBeginImageContext(imageView.frame.size)
    let context = UIGraphicsGetCurrentContext()
    imageView.layer.renderInContext(context!)
    let screenShot = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    UIImageWriteToSavedPhotosAlbum(screenShot, nil, nil, nil)
    return screenShot
}
After some searching I figured out the issue. I used the code below to replace "UIGraphicsBeginImageContext", and it works:
UIGraphicsBeginImageContextWithOptions(self.imageView.frame.size, true, 0)
This code looks pretty weird (why don’t you just use imageView.image?) but I don’t know the full context of your use case.
As you found, the reason for the loss of quality is that you are ignoring the screen's retina scale.
Read the documentation for UIGraphicsBeginImageContext and UIGraphicsBeginImageContextWithOptions and you’ll see the former uses a ‘scale factor of 1.0’.
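Putting the two together, a sketch of the corrected function in current Swift syntax; the names mirror the question and are assumptions on my part:
private func getScreenshot(imageView: UIImageView) -> UIImage? {
    // A scale of 0 means "use the screen's native scale", so Retina
    // displays are rendered at full resolution.
    UIGraphicsBeginImageContextWithOptions(imageView.frame.size, true, 0)
    defer { UIGraphicsEndImageContext() }
    guard let context = UIGraphicsGetCurrentContext() else { return nil }
    imageView.layer.render(in: context)
    return UIGraphicsGetImageFromCurrentImageContext()
}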

Text View Screenshot in Swift?

I found some source code for taking a view screenshot. I changed it a little and tried it, but the code has one small problem: the screenshot resolution is really bad, and I need a good-resolution screenshot. I tried to add a comment, but I'm new on Stack Overflow. Anyway, what can I do about this?
Link: Screenshot in swift iOS?
My code:
func textViewSS() {
    // Create the UIImage
    UIGraphicsBeginImageContext(textView.frame.size)
    textView.layer.renderInContext(UIGraphicsGetCurrentContext()!)
    let image = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    // Save it to the camera roll
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
}
Sample result:
http://i60.tinypic.com/s4wdn4.png
Try modifying the first line of your code to pass the scale if you are not satisfied with the resolution:
UIGraphicsBeginImageContextWithOptions(textView.frame.size, false, UIScreen.mainScreen().scale)
I don't know the requirements in your case, but drawViewHierarchyInRect is quicker/cheaper than renderInContext. You may want to consider it if it is applicable.
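For illustration, a sketch combining both suggestions (a scale-aware context plus drawHierarchy) in current Swift syntax; the function name is my own:
func textViewScreenshot() -> UIImage? {
    // Pass the screen scale so the output matches the on-screen resolution.
    UIGraphicsBeginImageContextWithOptions(textView.frame.size, false, UIScreen.main.scale)
    defer { UIGraphicsEndImageContext() }
    // drawHierarchy(in:afterScreenUpdates:) is generally cheaper than
    // rendering the layer into the context.
    textView.drawHierarchy(in: textView.bounds, afterScreenUpdates: true)
    return UIGraphicsGetImageFromCurrentImageContext()
}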