Replacement for a custom CIFilter in iOS 12

Since iOS 12, CIColorKernel(source: "kernel string") is deprecated. Does anyone know what Apple's replacement for it is?
I am looking for a way to write a custom CIFilter in Swift. Is there perhaps an open-source library?

It was announced back at WWDC 2017 that custom filters can also be written in the Metal Shading Language:
https://developer.apple.com/documentation/coreimage/writing_custom_kernels
So now, apparently, they are getting rid of the Core Image Kernel Language altogether.
Here's a quick intro to writing a CIColorKernel with Metal:
https://medium.com/@shu223/core-image-filters-with-metal-71afd6377f4
Writing kernels in Metal is actually easier; the only gotcha is that you need to specify two compiler flags in the project (see the article above).

I attempted to follow along with these blog posts and the Apple docs, but the integration between Core Image and Metal is quite confusing. After much searching, I ended up creating a working example iOS app that demonstrates how to write a grayscale Metal kernel function and plug it into the Core Image pipeline.

You can use it like this:
let url = Bundle.main.url(forResource: "default", withExtension: "metallib")!
let data = try! Data(contentsOf: url)
let kernel = try! CIKernel(functionName: "monochrome", fromMetalLibraryData: data)
let sampler = CISampler(image: inputImage)
let outputImage = kernel.apply(extent: inputImage.extent, roiCallback: { _, rect in rect }, arguments: [sampler])
According to Apple:
"You need to set these flags to use MSL as the shader language for a CIKernel. You must specify some options in Xcode under the Build Settings tab of your project's target. The first option you need to specify is an -fcikernel flag in the Other Metal Compiler Flags option. The second is to add a user-defined setting with a key called MTLLINKER_FLAGS with a value of -cikernel:

Related

[iOS][AVFoundation] Need classes similar to MediaCodec in Android

Is there any class in iOS that returns encoder/decoder capabilities, just like Android's MediaCodec/MediaCodecList? (https://developer.android.com/reference/android/media/MediaCodec)
I need to get the fps/profile/level and the width/height supported by each profile of the H.264 and HEVC codecs.
I found something related to AVCaptureSession, but that may not be the right fit, since we only need to support AVPlayer (the camera is not part of the flow).
AVFoundation has very limited format-capability queries. Refer to this StackOverflow answer.
So if somebody wants to query for codec info, I recommend using ffmpeg; to be specific, ffmpeg-kit. Check here for all the API documentation.
I wrote a sample of how to use it on iOS to query for codec info (or anything else you want). Please check it out:
showcase
let mediaInfoSession = FFprobeKit.getMediaInformation(kSampleFilePath)
let mediaInfo = mediaInfoSession?.getMediaInformation()
let props = mediaInfo?.getAllProperties()
let duration = mediaInfo?.getDuration()
let bitRate = mediaInfo?.getBitrate()
...
Feel free to contact me.

iOS framework: MTLLibrary newFunctionWithName fails when cikernel flag is set on client code

I have an iOS framework using Metal; it sets up a pipeline descriptor as follows:
...
_metalVars.mtlLibrary = [_metalVars.mtlDevice newLibraryWithFile:libraryPath error:&error];
id <MTLFunction> vertexFunction = [_metalVars.mtlLibrary newFunctionWithName:@"vertexShader"];
id <MTLFunction> fragmentFunction = [_metalVars.mtlLibrary newFunctionWithName:@"fragmentShader"];
MTLRenderPipelineDescriptor *pipelineStateDescriptor = [[MTLRenderPipelineDescriptor alloc] init];
pipelineStateDescriptor.label = @"MyPipeline";
pipelineStateDescriptor.vertexFunction = vertexFunction;
pipelineStateDescriptor.fragmentFunction = fragmentFunction;
pipelineStateDescriptor.vertexDescriptor = mtlVertexDescriptor;
pipelineStateDescriptor.colorAttachments[0].pixelFormat = MTLPixelFormatRGBA8Unorm;
_metalVars.mtlPipelineState = [_metalVars.mtlDevice newRenderPipelineStateWithDescriptor:pipelineStateDescriptor error:&error];
...
This framework is used by an iOS app, and has been working fine until the flag
-fcikernel
was set under Other Metal Compiler Flags, and
-cikernel
under Other Metal Linker Flags, in the client app's Build Settings.
(Flags set on the client, per https://developer.apple.com/documentation/coreimage/cikernel/2880194-kernelwithfunctionname.)
With those changes in the client app, the framework code snippet above now fails at its last line, on the call to newRenderPipelineStateWithDescriptor, with the error
validateWithDevice:2556: failed assertion `Render Pipeline Descriptor Validation
vertexFunction must not be nil
and I've verified that the framework is now returning nil for the vertex function at
id <MTLFunction> vertexFunction = [_metalVars.mtlLibrary newFunctionWithName:@"vertexShader"];
but works fine if I remove the flags on the client app.
Questions:
1.) I understood that Metal shaders were precompiled; why would changing a build setting in the client code cause this runtime failure in the unchanged framework?
2.) How can this be resolved?
Those flags are meant to be used when compiling Core Image kernels written in Metal. They change the way the code is translated in a way that is not compatible with "regular" Metal shader code.
The problem with setting the Core Image flags on the project level is that all .metal files are affected—regardless of whether they contain Core Image kernels or regular shader code.
The best way around this is by using a different file extension for files containing Core Image Metal kernels, like .ci.metal. Then custom build rules can be used to compile those kernels separately from the rest of the Metal codebase.
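As a sketch of that build-rule setup (the exact paths and option lists below are an assumption, not from the original answer): rename Core Image kernel files to *.ci.metal so the default Metal build rule no longer picks them up, then add two custom build rules, one that compiles *.ci.metal with the Core Image compiler flag and one that links the resulting *.ci.air with the Core Image linker flag:

```shell
# Custom build rule for files matching *.ci.metal:
# compile with the Metal compiler, passing the Core Image flag
xcrun metal -c -fcikernel "${INPUT_FILE_PATH}" \
    -o "${SCRIPT_OUTPUT_FILE_0}"   # e.g. ${DERIVED_FILE_DIR}/${INPUT_FILE_BASE}.air

# Custom build rule for files matching *.ci.air:
# link into a metallib, again with the Core Image flag
xcrun metallib -cikernel "${INPUT_FILE_PATH}" \
    -o "${SCRIPT_OUTPUT_FILE_0}"   # e.g. ${METAL_LIBRARY_OUTPUT_DIR}/MyKernels.ci.metallib
```

With this in place, the project-wide -fcikernel/-cikernel settings can be removed, so regular shaders (like the vertexShader/fragmentShader above) compile normally again.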
I recommend this session from WWDC 2020 which describes the process in detail.

ARKit, metal shader for ARSCNView

Trying to figure out how to solve my issue of applying shaders to my ARSCNView.
Previously, when using a standard SCNView, I successfully applied a distortion shader the following way:
if let path = Bundle.main.path(forResource: "art.scnassets/distortion", ofType: "plist") {
    if let dict = NSDictionary(contentsOfFile: path) {
        let technique = SCNTechnique(dictionary: dict as! [String: AnyObject])
        scnView.technique = technique
    }
}
Replacing SCNView with ARSCNView gives me the following error(s):
"Error: Metal renderer does not support nil vertex function name"
"Error: _executeProgram - no pipeline state"
I was thinking it's because ARSCNView uses a different renderer than SCNView. But logging ARSCNView.renderingAPI tells me nothing about the renderer, and I can't seem to choose one when I construct my ARSCNView instance. I must be missing something obvious, because I can't find a single resource on this online.
My initial idea was to instead use an SCNProgram to apply the shaders. But I can't find any resources on how to apply one to an ARSCNView, or whether that's even a correct/possible solution; SCNProgram seems to be reserved for materials.
Can anyone give me useful pointers on how to get vertex+fragment shaders working with an ARSCNView?
SCNTechnique for ARSCNView does not work with GLSL shaders; instead, Metal functions need to be provided in the technique's plist file under the keys metalVertexShader and metalFragmentShader.
On the contrary, the documentation says any combination of shaders should work:
You must specify both fragment and vertex shaders, and you must
specify either a GLSL shader program, a pair of Metal functions, or
both. If both are specified, SceneKit uses whichever shader is
appropriate for the current renderer.
So it might be a mistake, but I guess the documentation is simply outdated. Since all devices that can run ARKit also support Metal, GLSL support was apparently never added to ARSCNView.
With iOS 12 deprecating OpenGL, this looks planned.
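A minimal sketch of such a technique plist using the Metal keys (the pass name, shader function names, and input/output wiring below are assumptions for illustration, not from the original answer):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>passes</key>
    <dict>
        <!-- hypothetical pass name -->
        <key>distortionPass</key>
        <dict>
            <key>draw</key>
            <string>DRAW_QUAD</string>
            <!-- Metal function names defined in a .metal file in the app target -->
            <key>metalVertexShader</key>
            <string>distortionVertex</string>
            <key>metalFragmentShader</key>
            <string>distortionFragment</string>
            <key>inputs</key>
            <dict>
                <key>colorSampler</key>
                <string>COLOR</string>
            </dict>
            <key>outputs</key>
            <dict>
                <key>color</key>
                <string>COLOR</string>
            </dict>
        </dict>
    </dict>
    <key>sequence</key>
    <array>
        <string>distortionPass</string>
    </array>
</dict>
</plist>
```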
I had this issue with ARKit on iOS 11.4 and 12, and it came down to a series of misspelled shader names. I hope this might help someone.

Can I turn a string into a block of code in swift?

Is there any way to turn a string into a block of code? I'm making an Ajax request to a website of mine that has an endpoint returning some Swift code as a string. I can get that code back as a string, but I can't run it, because nothing knows that the string is code.
As others have pointed out, if you are creating an iOS app (especially for distribution on the App Store), you cannot do this. However, if you are writing Swift code for an OS X machine AND you know that Xcode is installed on that machine, you can run your Swift code string through the command-line Swift compiler. Something like this (with proper error checking, of course):
var str = "let str = \"Hello\"\nprint(\"\\(str) world\")\n"
let task = Process()
task.launchPath = "/usr/bin/swift"
let outpipe = Pipe()
let inpipe = Pipe()
task.standardInput = inpipe
task.standardOutput = outpipe
task.launch()
// Write the source, then close the handle so the compiler sees EOF
inpipe.fileHandleForWriting.write(str.data(using: .utf8)!)
inpipe.fileHandleForWriting.closeFile()
let data = outpipe.fileHandleForReading.readDataToEndOfFile()
task.waitUntilExit()
let output = String(data: data, encoding: .utf8)!
Again, this is probably not recommended in nearly all real-world cases, but is a way you can execute a String of Swift code, if you really need to.
No, you can't do that. Swift is a compiled language, not an interpreted one (Ajax, in any case, is a request technique, not a language).
The Swift compiler runs on your Mac, not on the iOS device. (The same is true of Objective-C.)
Plus, Apple's App Store guidelines forbid delivering executable code to your apps, so even if you figured out a way to do it, your app would be rejected.
Edit:
Note that with the advent of Swift Playgrounds, it is possible to run the Swift compiler on an iPad. Recent high-end iPhones are probably also up to the job, but you'd have to figure out how to get it installed there.
As stated above though, Apple's app store guidelines forbid you from delivering code to your apps at runtime.

Not Getting QR Code Data Using AVFoundation Framework

I used the AVFoundation framework delegate methods to read QR codes. It reads almost all QR codes and returns their data. But with some QR codes (e.g. the QR image below), it detects that a QR code is present but does not return any data for it.
Your sample is triggering an internal (C++) exception. It seems to be getting caught around [AVAssetCache setMaxSize:], which suggests either that the data in this particular sample is corrupt or that it's just too large for AVFoundation to handle.
As it's an internal exception, it fails (mostly) silently. The exception occurs when you try to extract the stringValue from your AVMetadataMachineReadableCodeObject.
So if you test for the existence of your AVMetadataMachineReadableCodeObject, you will get YES, whereas if you test for stringValue, you will get NO.
AVMetadataMachineReadableCodeObject *readableObject =
    (AVMetadataMachineReadableCodeObject *)[self.previewLayer
        transformedMetadataObjectForMetadataObject:metadataObject];
BOOL foundObject = readableObject != nil;             // returns YES
BOOL foundString = readableObject.stringValue != nil; // returns NO + triggers internal exception
It's probably best to test for the string, rather than the object, and ignore any result that returns NO.
update
In your comment you ask about native framework solution that will read this barcode. AVFoundation is the native framework for barcode reading, so if it fails on your sample, you will have to look for third-party solutions.
zxing offers an iOS port, but it looks old and unsupported.
ZBarSDK used to be a good solution but also seems to be unsupported past iOS 4. As AVFoundation now has built-in barcode reading, this is unsurprising.
This solution by Accusoft does read the sample but is proprietary and really pricey.
I do wonder about the content of your sample, though; it looks either corrupt or like some kind of exotic encoding...
