Metal API, GetBytes Validation rowBytes(1600) must be >= (4680) - iOS

Metal experts!
I'm struggling with the following error:
failed assertion `GetBytes Validation rowBytes(1600) must be >= (4680)
at
texture.getBytes(pixelBufferBytes, bytesPerRow: bytesPerRow, from: region, mipmapLevel: 0)
which occurs when I try to record the MTLTexture as a video frame using AVFoundation. To do this, I'm converting the MTLTexture into a CVPixelBuffer following the approach from the following SO answer (see VideoRecorder.writeFrame).
Here is where I pass the texture to the recorder:
extension MetalRenderer: MTKViewDelegate {
    func mtkView(_: MTKView, drawableSizeWillChange _: CGSize) {}

    func draw(in view: MTKView) {
        guard let drawable = view.currentDrawable, let descriptor = view.currentRenderPassDescriptor else {
            return
        }
        let commandBuffer = commandQueue.makeCommandBuffer()!
        let commandEncoder = commandBuffer.makeRenderCommandEncoder(descriptor: descriptor)!
        let deltaTime = 1 / Float(view.preferredFramesPerSecond)
        scene?.render(commandEncoder: commandEncoder, deltaTime: deltaTime)
        commandEncoder.endEncoding()
        commandBuffer.present(drawable)
        commandBuffer.commit()
        commandBuffer.addCompletedHandler { commandBuffer in
            let texture = drawable.layer.nextDrawable()?.texture
            recorder.writeFrame(forTexture: texture!)
        }
    }
}
I suppose something could be wrong with addCompletedHandler, or with how I grab the texture?
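Both suspicions are worth checking: Metal requires completion handlers to be registered before the command buffer is committed, and calling nextDrawable() inside the handler requests a new drawable rather than the one being presented. A minimal sketch of the expected ordering, reusing the names from the code above:

let texture = drawable.texture // capture the texture of the drawable being presented
// Register the handler before commit; Metal asserts otherwise.
commandBuffer.addCompletedHandler { _ in
    recorder.writeFrame(forTexture: texture)
}
commandBuffer.present(drawable)
commandBuffer.commit()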
Workarounds
Workaround 1
I tried disabling Metal API Validation in the Run scheme and got a bunch of the following errors, but I'm not sure how relevant they are:
2023-01-29 23:03:50.009525+0100 [Client] Synchronous remote object proxy returned error: Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service named com.apple.commcenter.coretelephony.xpc was invalidated: failed at lookup with error 3 - No such process." UserInfo={NSDebugDescription=The connection to service named com.apple.commcenter.coretelephony.xpc was invalidated: failed at lookup with error 3 - No such process.}
Workaround 2
I also tried setting metalView.framebufferOnly to false, which didn't help to resolve the crash.
I'd appreciate it if someone could answer. Thanks in advance!

The error message failed assertion `GetBytes Validation rowBytes(1600) must be >= (4680) means that the bytesPerRow you pass to getBytes, taken from the CVPixelBuffer, is too small to hold one row of the texture: with a 4-byte-per-pixel format such as BGRA8, 1600 bytes corresponds to a 400-pixel-wide row, while the texture needs 4680 bytes (1170 pixels) per row. In other words, the CVPixelBuffer is smaller than the MTLTexture. Creating the CVPixelBuffer with the same width and height as the texture resolves the issue.
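A minimal sketch of that fix, assuming a .bgra8Unorm texture (the helper name makePixelBuffer is illustrative):

import CoreVideo
import Metal

func makePixelBuffer(from texture: MTLTexture) -> CVPixelBuffer? {
    var pixelBuffer: CVPixelBuffer?
    // Create the buffer with the texture's own dimensions so its
    // bytesPerRow is guaranteed to cover a full texture row.
    let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                     texture.width,
                                     texture.height,
                                     kCVPixelFormatType_32BGRA,
                                     nil,
                                     &pixelBuffer)
    guard status == kCVReturnSuccess, let buffer = pixelBuffer else { return nil }
    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }
    // Use the buffer's own bytesPerRow, which may include row padding.
    texture.getBytes(CVPixelBufferGetBaseAddress(buffer)!,
                     bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                     from: MTLRegionMake2D(0, 0, texture.width, texture.height),
                     mipmapLevel: 0)
    return buffer
}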

Related

iOS Metal Invalid device store executing vertex function "myFunction" encoder: draw: offset:

I have a project using Metal rendering; it works fine when I pass one Metal buffer to the shader function. However, when I pass a different buffer, I get the following error:
(IOAF code 11)
and, with the "Shader validation enabled" scheme diagnostic flag:
Invalid device store executing vertex function "myFunction" encoder: draw: offset:
Both buffers are initialized using the same function. The lifetime of the first object holding the buffer is much longer than that of the second one.
Is the error caused by the object holding the buffer being deallocated?
/// Initializes the buffer with zeros; the buffer is given an appropriate length based on the provided element count.
init(device: MTLDevice, count: Int, index: UInt32, label: String? = nil, options: MTLResourceOptions = []) {
    guard let buffer = device.makeBuffer(length: MemoryLayout<Element>.stride * count, options: options) else {
        fatalError("Failed to create MTLBuffer.")
    }
    self.buffer = buffer
    self.buffer.label = label
    self.count = count
    self.index = Int(index)
}
// Usage:
renderEncoder.setVertexBuffer(/* one of two buffers */)
The issue was caused by the shader function writing the output of its calculation at a wrong index. As long as the shader indexes the buffer within 0..<count, everything works with either buffer; the error appeared because the shader used a wrong index and stepped out of the buffer's range.
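A cheap host-side check can catch this class of bug before the GPU does. A minimal sketch, where myBuffer and vertexCount stand in for the question's buffer and draw size:

// Never let a draw call index past the buffer's element count.
precondition(vertexCount <= myBuffer.count,
             "Shader would store out of the buffer's range")
renderEncoder.setVertexBuffer(myBuffer.buffer, offset: 0, index: myBuffer.index)
renderEncoder.drawPrimitives(type: .triangle, vertexStart: 0, vertexCount: vertexCount)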

QLThumbnailGenerator starts failing when called multiple times (on actual device) iOS 13

I am trying to create thumbnail images of multiple Wallet Passes (.pkpass) by running a loop over all (around 200) passes in a specific folder and calling generateBestRepresentation(for:) for each of them.
This is the code:
let passesDirURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!.appendingPathComponent("Passes")
let size = CGSize(width: 1600, height: 1600)
let scale = UIScreen.main.scale
if let passURLs = try? FileManager.default.contentsOfDirectory(
    at: passesDirURL,
    includingPropertiesForKeys: nil,
    options: .skipsHiddenFiles
),
    !passURLs.isEmpty {
    for passURL in passURLs {
        // Create the thumbnail request.
        let request = QLThumbnailGenerator.Request(
            fileAt: passURL,
            size: size,
            scale: scale,
            representationTypes: .thumbnail
        )
        // Retrieve the singleton instance of the thumbnail generator and generate the thumbnails.
        let generator = QLThumbnailGenerator.shared
        generator.generateBestRepresentation(for: request) { thumbnail, error in
            if let error = error as? QLThumbnailError {
                print("Thumbnail generation error: \(error)")
                print("Thumbnail generation localizedDescription: \(error.localizedDescription)")
                print("Thumbnail generation errorUserInfo: \(error.errorUserInfo)")
                print("Thumbnail generation errorCode: \(error.errorCode)")
            } else {
                print("Thumbnail generation OK")
                // Do something with the thumbnail here.
            }
        }
    }
}
This works fine in the simulator, but on an actual device (iPhone XS Max), sooner or later I start getting errors and thumbnail generation fails for a large fraction of the passes. The output looks as follows:
Thumbnail generation error: related decl 'e' for QLThumbnailError(_nsError: Error Domain=QLThumbnailErrorDomain Code=3 "No thumbnail in the cloud for file:///private/var/mobile/Containers/Data/Application/DCF703F7-9A1A-4340-86EB-42579D678EEF/Documents/Passes/pass123.pkpass" UserInfo={NSErrorFailingURLKey=file:///private/var/mobile/Containers/Data/Application/DCF703F7-9A1A-4340-86EB-42579D678EEF/Documents/Passes/pass123.pkpass})
Thumbnail generation localizedDescription: The operation couldn’t be completed. (QLThumbnailErrorDomain error 3.)
Thumbnail generation errorUserInfo: ["NSErrorFailingURLKey": file:///private/var/mobile/Containers/Data/Application/DCF703F7-9A1A-4340-86EB-42579D678EEF/Documents/Passes/pass123.pkpass]
Thumbnail generation errorCode: 3
The error description sounds confusing ("No thumbnail in the cloud for file") as these are not iCloud files.
As the error does not occur when generating the thumbnails individually, this seems to be some memory/performance issue. I tried to work around it in many ways, including using a semaphore in the for loop to wait for the completion of one generateBestRepresentation call before starting the next, which reduced but did not eliminate the issue. The only way it worked without errors was adding a very long sleep (5 seconds) after the semaphore.wait() statement, but that is not an acceptable solution.
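For reference, a minimal sketch of that semaphore-based throttling, reusing the names from the code above (it must run off the main thread, since wait() blocks):

let semaphore = DispatchSemaphore(value: 1)
for passURL in passURLs {
    semaphore.wait() // block until the previous request has finished
    let request = QLThumbnailGenerator.Request(
        fileAt: passURL,
        size: size,
        scale: scale,
        representationTypes: .thumbnail
    )
    QLThumbnailGenerator.shared.generateBestRepresentation(for: request) { thumbnail, error in
        defer { semaphore.signal() }
        // Handle the thumbnail or error here.
    }
}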
Another way I tried was using saveBestRepresentation (as suggested in Apple's documentation), but this did not solve the issue either.
Has anyone faced a similar issue and found an acceptable solution?

"Presentation is in progress" issue following error

I am playing with Vision for text recognition: I present the camera, take a photo, and the text is detected and processed; it runs very well. The issue I have is when there is no text in the photo. I get an error from VNImageRequestHandler, which is fine, but then I can't re-open the camera; I get "Warning: Attempt to present UIImagePickerController: ... while a presentation is in progress!".
Here is some code where I process the image looking for text:
func processImage(_ image: UIImage?) { // enclosing function reconstructed from the excerpt
    guard let image = image, let cgImage = image.cgImage else { return }
    let requests = [textDetectionRequest]
    let imageRequestHandler = VNImageRequestHandler(cgImage: cgImage, orientation: .up, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        do {
            try imageRequestHandler.perform(requests)
        } catch {
            print("Error: \(error)")
        }
    }
}
The error is:
"Error: Error Domain=com.apple.vis Code=11 "encountered unknown exception" UserInfo={NSLocalizedDescription=encountered unknown exception}"
which is fine; I just want to be able to open the UIImagePickerController after that error.
I have tried dismissing the UIImagePickerController, which does not work, and I can't find what presentation is actually in progress.
Thanks.
For me, it was something completely unrelated that caused this error. After the VNRequestCompletionHandler was called, I attempted to initialize a malformed NSPredicate; then I would get the error you described. Fixing the predicate also fixed the issue.
I would check whether there is any work you do after the completion handler is called that can throw an error, and fix that.

AVFoundation captureOutput didOutputSampleBuffer Delay

I am using AVFoundation's captureOutput didOutputSampleBuffer to extract an image that is then used for a filter:
self.bufferFrameQueue = DispatchQueue(label: "bufferFrame queue", qos: DispatchQoS.background, attributes: [], autoreleaseFrequency: .inherit)
self.videoDataOutput = AVCaptureVideoDataOutput()
if self.session.canAddOutput(self.videoDataOutput) {
    self.session.addOutput(videoDataOutput)
    self.videoDataOutput!.alwaysDiscardsLateVideoFrames = true
    self.videoDataOutput!.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
    self.videoDataOutput!.setSampleBufferDelegate(self, queue: self.bufferFrameQueue)
}
func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
    connection.videoOrientation = .portrait
    let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    DispatchQueue.main.async {
        self.cameraBufferImage = ciImage
    }
}
The code above just updates self.cameraBufferImage whenever there's a new output sample buffer.
Then, when a filter button is pressed, I use self.cameraBufferImage like this:
func filterButtonPressed() {
    if let inputImage = self.cameraBufferImage {
        if let currentFilter = CIFilter(name: "CISepiaTone") {
            currentFilter.setValue(inputImage, forKey: "inputImage")
            currentFilter.setValue(1, forKey: "inputIntensity")
            if let output = currentFilter.outputImage {
                if let cgimg = self.context.createCGImage(output, from: inputImage.extent) {
                    self.filterImageLayer = CALayer()
                    self.filterImageLayer!.frame = self.imagePreviewView.bounds
                    self.filterImageLayer!.contents = cgimg
                    self.filterImageLayer!.contentsGravity = kCAGravityResizeAspectFill
                    self.imagePreviewView.layer.addSublayer(self.filterImageLayer!)
                }
            }
        }
    }
}
When the above method is invoked, it grabs the 'current' self.cameraBufferImage and uses it to apply the filter. This works fine at normal exposure durations (below about 1/15 second).
Issue
When the exposure duration is long, e.g. 1/3 second, it takes a while (about 1/3 second) to apply the filter. This delay is only present the first time after launch; if done again, there is no delay at all.
Thoughts
I understand that if the exposure duration is 1/3 second, didOutputSampleBuffer only updates every 1/3 second. But why the initial delay? Shouldn't it just grab whatever self.cameraBufferImage is available at that exact moment, instead of waiting?
Queue issue?
CMSampleBuffer retain issue? (Although in Swift 3, there is no CFRetain.)
Update
Apple's Documentation
Delegates receive this message whenever the output captures and outputs a new video frame, decoding or re-encoding it as specified by its videoSettings property. Delegates can use the provided video frame in conjunction with other APIs for further processing.

This method is called on the dispatch queue specified by the output's sampleBufferCallbackQueue property. It is called periodically, so it must be efficient to prevent capture performance problems, including dropped frames.

If you need to reference the CMSampleBuffer object outside of the scope of this method, you must CFRetain it and then CFRelease it when you are finished with it.

To maintain optimal performance, some sample buffers directly reference pools of memory that may need to be reused by the device system and other capture inputs. This is frequently the case for uncompressed device native capture where memory blocks are copied as little as possible. If multiple sample buffers reference such pools of memory for too long, inputs will no longer be able to copy new samples into memory and those samples will be dropped.

If your application is causing samples to be dropped by retaining the provided CMSampleBuffer objects for too long, but it needs access to the sample data for a long period of time, consider copying the data into a new buffer and then releasing the sample buffer (if it was previously retained) so that the memory it references can be reused.
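The last paragraph suggests copying instead of retaining. A minimal sketch of deep-copying a pixel buffer before handing it off, assuming a non-planar format such as the 32BGRA set above (the helper name copyPixelBuffer is illustrative; error handling is reduced to returning nil):

import CoreVideo
import Foundation

func copyPixelBuffer(_ source: CVPixelBuffer) -> CVPixelBuffer? {
    var copy: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                     CVPixelBufferGetWidth(source),
                                     CVPixelBufferGetHeight(source),
                                     CVPixelBufferGetPixelFormatType(source),
                                     nil,
                                     &copy)
    guard status == kCVReturnSuccess, let destination = copy else { return nil }
    CVPixelBufferLockBaseAddress(source, .readOnly)
    CVPixelBufferLockBaseAddress(destination, [])
    defer {
        CVPixelBufferUnlockBaseAddress(destination, [])
        CVPixelBufferUnlockBaseAddress(source, .readOnly)
    }
    // Rows may be padded differently in the two buffers, so copy row by row.
    let srcBytesPerRow = CVPixelBufferGetBytesPerRow(source)
    let dstBytesPerRow = CVPixelBufferGetBytesPerRow(destination)
    var srcPtr = CVPixelBufferGetBaseAddress(source)!
    var dstPtr = CVPixelBufferGetBaseAddress(destination)!
    for _ in 0..<CVPixelBufferGetHeight(source) {
        memcpy(dstPtr, srcPtr, min(srcBytesPerRow, dstBytesPerRow))
        srcPtr = srcPtr.advanced(by: srcBytesPerRow)
        dstPtr = dstPtr.advanced(by: dstBytesPerRow)
    }
    return destination
}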

Crashing error after cleaning up after saving FBO to camera roll? Swift 2.0 selector syntax

So I have this code to save the currently bound FBO to the camera roll. The first part of this code works perfectly: if I don't try to clean up the buffer or image reference, everything works fine, and a picture is placed in the camera roll. Unfortunately there is a 4 MB memory leak as a result.
So apparently I need to clean up some of the data.
The first place I thought to look was my var buffer = UnsafeMutablePointer<GLubyte>(nil). The problem is that if you free it right after the UIImageWriteToSavedPhotosAlbum call, you get a really odd crash with no stack trace that makes sense.
So I figure it takes time for the data to save to the photo album, and as such I need to use a completion selector. The problem is that I have tried a couple of different ways of using the selector, but every time I get a crash and a message from NSForwarding; in this case I get:
NSForwarding: warning: object 0x16e6bb60 of class 'App.ScreenshotSaving' does not implement methodSignatureForSelector: -- trouble ahead
Unrecognized selector -[App.ScreenshotSaving methodSignatureForSelector:]
For reference, this class is instantiated inside a static container like so:
class Storage
{
    static var ssave = ScreenshotSaving()
}
and as such, when it's time to take a screenshot, Storage.ssave.saveScreenshot() is called.
import Foundation
import GLKit
import OpenGLES
import Fabric

class ScreenshotSaving
{
    var myImage = UIImage()
    var buffer = UnsafeMutablePointer<GLubyte>(nil)

    func saveScreenshot()
    {
        var width: GLint = 0
        var height: GLint = 0
        glGetRenderbufferParameteriv(GLenum(GL_RENDERBUFFER), GLenum(GL_RENDERBUFFER_WIDTH), &width)
        glGetRenderbufferParameteriv(GLenum(GL_RENDERBUFFER), GLenum(GL_RENDERBUFFER_HEIGHT), &height)
        let mdl: Int = Int(width * height * 4)
        buffer = UnsafeMutablePointer<GLubyte>(malloc(mdl))
        glReadPixels(0, 0, width, height, GLenum(GL_RGBA), GLenum(GL_UNSIGNED_BYTE), buffer)
        let provider = CGDataProviderCreateWithData(nil, buffer, mdl, nil)
        let bitsPerComponent: Int = 8
        let bitsPerPixel: Int = 32
        let bytesPerRow: Int = 4 * Int(width)
        let colorSpace = CGColorSpaceCreateDeviceRGB()
        let bitmapInfo: CGBitmapInfo = CGBitmapInfo(rawValue: 0 << 12)
        let renderIntent = CGColorRenderingIntent.RenderingIntentDefault
        let imageRef = CGImageCreate(Int(width), Int(height), bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpace, bitmapInfo, provider, nil, false, renderIntent)
        FabricI.crashLog("Save screenshot: Finished image ref")
        myImage = UIImage(CGImage: imageRef!)
        UIImageWriteToSavedPhotosAlbum(myImage, self, #selector(ScreenshotSaving.finishedPic), nil)
    }

    @objc func finishedPic()
    {
        myImage = UIImage()
        free(buffer)
    }
}
One more question: when the photo is saved to the photo album, is it compressed like a regular image, or will it be the same size as the raw data?
Your current crash happens because your class isn't a subclass of NSObject, so it doesn't know how to find the target method (the func).
I see you already tried that in the comments. The crash when you try that is because the selector has specific requirements in this case. The method signature for the selector must match (the form of):
- (void)image: (UIImage *) image didFinishSavingWithError: (NSError *) error contextInfo: (void *) contextInfo
The name can be different, but it must take three parameters with those types. In Swift 2 that is:
func image(image: UIImage, didFinishSavingWithError error: NSError?, contextInfo: UnsafeMutablePointer<Void>)
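For reference, a minimal sketch of the corrected class in current Swift (names are illustrative; the buffer cleanup from the question would go inside the callback):

import UIKit

class ScreenshotSaving: NSObject { // NSObject subclass so the selector machinery works
    func save(_ image: UIImage) {
        UIImageWriteToSavedPhotosAlbum(image, self,
            #selector(ScreenshotSaving.image(_:didFinishSavingWithError:contextInfo:)), nil)
    }

    // The required three-parameter signature; called once the save has finished.
    @objc func image(_ image: UIImage, didFinishSavingWithError error: Error?, contextInfo: UnsafeRawPointer) {
        // Safe to release the pixel buffer here.
    }
}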
