ARKit, Metal shader for ARSCNView - iOS

I'm trying to figure out how to apply shaders to my ARSCNView.
Previously, when using a standard SCNView, I was able to successfully apply a distortion shader the following way:
if let path = Bundle.main.path(forResource: "art.scnassets/distortion", ofType: "plist") {
    if let dict = NSDictionary(contentsOfFile: path) {
        let technique = SCNTechnique(dictionary: dict as! [String: AnyObject])
        scnView.technique = technique
    }
}
Replacing SCNView with ARSCNView gives me the following error(s):
"Error: Metal renderer does not support nil vertex function name"
"Error: _executeProgram - no pipeline state"
I was thinking it's because ARSCNView uses a different renderer than SCNView. But logging ARSCNView.renderingAPI tells me nothing about the renderer, and I can't seem to choose one when I construct my ARSCNView instance. I must be missing something obvious, because I can't find a single resource when scouring for references online.
My initial idea was to instead use an SCNProgram to apply the shaders. But I can't find any resources on how to apply it to an ARSCNView, or whether it's even a correct/possible solution; SCNProgram seems to be reserved for materials.
Is anyone able to give me any useful pointers on how to get vertex+fragment shaders working for an ARSCNView?

SCNTechnique for ARSCNView does not work with GLSL shaders; instead, Metal functions need to be provided in the technique's plist file under the keys metalVertexShader and metalFragmentShader.
On the contrary, the documentation says any combination of shaders should work:
You must specify both fragment and vertex shaders, and you must specify either a GLSL shader program, a pair of Metal functions, or both. If both are specified, SceneKit uses whichever shader is appropriate for the current renderer.
So it might be a mistake, but I guess the documentation is simply outdated. Since all devices that run ARKit also support Metal, GLSL support was apparently never added for ARSCNView.
As iOS 12 deprecates OpenGL ES, this looks planned.
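For reference, here is a minimal sketch of such a Metal-only technique dictionary built in code; the pass name and the shader function names ("distort_vertex", "distort_fragment") are my own placeholders for whatever is defined in your .metal file, and the same keys work in the plist:
let dict: [String: Any] = [
    "passes": [
        "distort": [
            "draw": "DRAW_QUAD",                  // full-screen post-process pass
            "inputs": ["colorSampler": "COLOR"],  // the rendered scene as input
            "outputs": ["color": "COLOR"],        // write back to the color buffer
            "metalVertexShader": "distort_vertex",
            "metalFragmentShader": "distort_fragment"
        ]
    ],
    "sequence": ["distort"]
]
sceneView.technique = SCNTechnique(dictionary: dict)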

I had this issue with ARKit on iOS 11.4 and 12, and it came down to a series of misspelled shader names. I hope this might help someone.

Related

Replacement for a custom CIFilter in iOS 12

As of iOS 12, CIColorKernel(source: "kernel string") is deprecated. Does anybody know what Apple's replacement for it is?
I am searching for a way to write a custom CIFilter in Swift. Maybe there is an open-source library?
It was announced back at WWDC 2017 that custom filters can also be written with Metal Shading Language -
https://developer.apple.com/documentation/coreimage/writing_custom_kernels
So now apparently they are getting rid of Core Image Kernel Language altogether.
Here's a quick intro to writing a CIColorKernel with Metal -
https://medium.com/@shu223/core-image-filters-with-metal-71afd6377f4
Writing kernels with Metal is actually easier; the only gotcha is that you need to specify two compiler flags in the project (see the article above).
I attempted to follow along with these blog posts and the Apple docs, but this integration between Core Image and Metal is quite confusing. After much searching, I ended up creating an actual working example iOS app that demonstrates how to write a Metal kernel grayscale function and have it process the Core Image pipeline.
You can use it like this:
let url = Bundle.main.url(forResource: "default", withExtension: "metallib")!
let data = try! Data(contentsOf: url)
let kernel = try! CIKernel(functionName: "monochrome", fromMetalLibraryData: data)
let sampler = CISampler(image: inputImage)
let outputImage = kernel.apply(extent: inputImage.extent, roiCallback: { _, rect in rect }, arguments: [sampler])
According to Apple:
"You need to set these flags to use MSL as the shader language for a CIKernel. You must specify some options in Xcode under the Build Settings tab of your project's target. The first option you need to specify is an -fcikernel flag in the Other Metal Compiler Flags option. The second is to add a user-defined setting with a key called MTLLINKER_FLAGS with a value of -cikernel."

Mixed topology (quad/tri) with ModelIO

I'm importing some simple OBJ assets using ModelIO like so:
let mdlAsset = MDLAsset(url: url, vertexDescriptor: nil, bufferAllocator: nil, preserveTopology: true, error: nil)
... and then adding them to a SceneKit SCN file. But whenever I have meshes containing both quads and tris (often the case, for example with eyeball meshes), the resulting mesh is jumbled:
[Image: incorrect mesh topology]
Re-topologizing isn't a good option since I sometimes have low-poly meshes with very specific topology, so I can't just set preserveTopology to false... I need a result with variable topology (i.e. MDLGeometryType.variableTopology).
How do I import these files correctly preserving their original topology?
I reported this as a bug at Apple Bug Reporter on 25th of November, bug id: 35687088
Summary: SCNSceneSourceLoadingOptionPreserveOriginalTopology does not actually preserve the original topology. Instead, it converts the geometry to all quads, messing up the 3D model badly. Based on its name it should behave exactly like preserveTopology of Model IO asset loading.
Steps to Reproduce: Load an OBJ file that has both triangles and polygons using SCNSceneSourceLoadingOptionPreserveOriginalTopology and load the same file into an MDLMesh using preserveTopology of ModelIO. Notice how it only works properly for the latter. Even when you create a new SCNGeometry based on the MDLMesh, it will "quadify" the mesh again to contain only quads (while it should support 3-gons and up).
On December 13th I received a reply with a request for sample code and assets, which I supplied 2 days later. I have not received a reply since (hopefully just because they are busy catching up after the holiday season...).
As I mentioned in my bug report's summary, loading the asset with Model I/O does work properly, but when you then create an SCNNode based on that MDLMesh, the geometry gets messed up again.
In my case the OBJ files I load have a known format, as they are always files also exported by my app (no normals, colors, or UVs). So what I do is load the information of the MDLMesh (buffers, face topology, etc.) manually into arrays, from which I then create an SCNGeometry by hand. I don't have a complete, separate piece of code of that for you, as there is a lot of it and it is mixed with code specific to my app, and it's in Objective-C. But to illustrate:
NSError *scnsrcError;
MDLAsset *asset = [[MDLAsset alloc] initWithURL:objURL vertexDescriptor:nil bufferAllocator:nil preserveTopology:YES error:&scnsrcError];
NSLog(@"%@", scnsrcError.localizedDescription);
MDLMesh *newMesh = (MDLMesh *)[asset objectAtIndex:0];
for (MDLSubmesh *faces in newMesh.submeshes) {
    // faceTopology holds the vertex count of each face; indexBuffer holds the vertex indices
    MDLMeshBufferData *topo = faces.topology.faceTopology;
    MDLMeshBufferData *vertIx = faces.indexBuffer;
    MDLMeshBufferData *verts = newMesh.vertexBuffers.firstObject;
    int faceCount = (int)faces.topology.faceCount;
    int8_t *faceIndexValues = malloc(faceCount * sizeof(int8_t));
    memcpy(faceIndexValues, topo.data.bytes, faceCount * sizeof(int8_t));
    int32_t *vertIndexValues = malloc(faces.indexCount * sizeof(int32_t));
    memcpy(vertIndexValues, vertIx.data.bytes, faces.indexCount * sizeof(int32_t));
    SCNVector3 *vertValues = malloc(newMesh.vertexCount * sizeof(SCNVector3));
    memcpy(vertValues, verts.data.bytes, newMesh.vertexCount * sizeof(SCNVector3));
    ....
    ....
}
In short, the preserveTopology option in SceneKit isn't working properly. To get from the working version in Model I/O to SceneKit, I basically had to write my own converter.
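For anyone attempting the same in Swift, the core of such a converter is an SCNGeometryElement with the .polygon primitive type, whose data is the per-polygon vertex counts followed by all vertex indices. A rough sketch, with the input arrays standing in for the buffers copied out of the MDLMesh above:
import SceneKit

func makePolygonGeometry(vertices: [SCNVector3],
                         polygonVertexCounts: [Int32],
                         vertexIndices: [Int32]) -> SCNGeometry {
    let vertexSource = SCNGeometrySource(vertices: vertices)

    // For .polygon the element data is [count per polygon...] followed by
    // [indices...], both using the same index size.
    var elementData = Data()
    polygonVertexCounts.withUnsafeBufferPointer { elementData.append(Data(buffer: $0)) }
    vertexIndices.withUnsafeBufferPointer { elementData.append(Data(buffer: $0)) }

    let element = SCNGeometryElement(data: elementData,
                                     primitiveType: .polygon,
                                     primitiveCount: polygonVertexCounts.count,
                                     bytesPerIndex: MemoryLayout<Int32>.size)
    return SCNGeometry(sources: [vertexSource], elements: [element])
}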

WebGL: replace program shader

I'm trying to swap the fragment shader used in a program. The fragment shaders all have the same variables, just different calculations. I am trying to provide alternative shaders for lower-end hardware.
I end up getting single-color output (instead of a texture). Does anyone have an idea what I could be doing wrong? I know the shaders are being used, since the color changes accordingly.
// if I don't do this:
// WebGL: INVALID_OPERATION: attachShader: shader attachment already has shader
gl.detachShader(program, attachedFS);
// select a random shader; they all use the same parameters
attachedFS = fragmentShaders[~~(Math.random() * fragmentShaders.length)];
// attach the new shader
gl.attachShader(program, attachedFS);
// if I don't do this, nothing happens
gl.linkProgram(program);
// if I don't add this line:
// globject.js:313 WebGL: INVALID_OPERATION: uniform2f: location not for current program
updateLocations();
I am assuming you have called gl.compileShader(fragmentShader).
Have you tried testing the code in a different browser to see if you get the same behavior? (It could be specific to a standards implementation.)
Have you tried deleting the fragment shader (gl.deleteShader(attachedFS);) right after detaching it? The previous shader may still be referenced in memory.
If this does not let you move forward, you may have to detach both shaders (vertex and fragment) and reattach them, or even recreate the program from scratch.
I found the issue, after trying about everything else without result. It also explains why I was seeing the shader change but only getting a flat color: I was not updating some of the attributes. After re-linking, attribute and uniform locations are not guaranteed to stay the same, so every location has to be re-queried and every vertex attribute re-bound.

Dynamic naming of objects in AudioKit (SpriteKit)

I am trying to create an app similar to the Reactable.
The user will be able to drag "modules" like an oscillator or filter from a menu into the "play area" and the module will be activated.
I am thinking of initializing the modules as they intersect with the "play area" background object. However, this requires me to name the modules automatically, i.e.:
let osci = AKOscillator()
where osci will automatically count up to be:
let osci1 = AKOscillator()
let osci2 = AKOscillator()
...
etc.
How will I be able to do this?
Thanks
Edit: I am trying to use an array, created as
var osciArray = [AKOscillator]()
and in my function that adds an oscillator, this is my code:
let oscis = AKOscillator()
osciArray.append(oscis)
osciArray[oscCounter].frequency = freqValue
osciArray[oscCounter].amplitude = 0.5
osciArray[oscCounter].start()
selectedNode.userData = ["counter": oscCounter]
oscCounter += 1
currentOutput = osciArray[oscCounter]
AudioKit.output = currentOutput
AudioKit.start()
My app builds fine, but once the app starts running in the Simulator I get the error: fatal error: Index out of range
I haven't used AudioKit, but I read about it a while ago and I have quite a big interest in it. From what I understand from the documentation, it's structured pretty much like SpriteKit: nodes connected together.
I would guess, then, that most classes in the library derive from a base class, just like everything in SpriteKit derives from SKNode.
Since you are linking the AudioKit nodes with visual representations via SpriteKit nodes, why don't you simply subclass SKSpriteNode and add an optional audioNode property typed as the AudioKit base class?
That way you can use SpriteKit to interact directly with the stored audio node property.
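A minimal sketch of that idea, assuming AudioKit's processing classes share the AKNode base class:
import SpriteKit
import AudioKit

class ModuleSprite: SKSpriteNode {
    var audioNode: AKNode?  // oscillator, filter, etc.
}

let oscSprite = ModuleSprite(imageNamed: "oscillator")
oscSprite.audioNode = AKOscillator()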
There's a lot of AudioKit-related code in your question, but to answer it you only have to look at oscCounter. You don't show its initial value, but I am guessing it is zero. You increment it by 1 and then try to access osciArray[oscCounter], but the array has only one element, so it should be accessed as osciArray[0]. Move the counter increment lower and you'll be better off. Furthermore, your oscillators look like local variables, so they'll be lost once the scope is exited; they should be declared as instance variables in your class or whatever this is part of.
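A sketch of the corrected ordering, reusing the variables from the question (read the element before incrementing the counter, and keep osciArray and oscCounter as instance properties so they outlive the function):
let osc = AKOscillator()
osc.frequency = freqValue
osc.amplitude = 0.5
osc.start()
osciArray.append(osc)
selectedNode.userData = NSMutableDictionary(dictionary: ["counter": oscCounter])
currentOutput = osciArray[oscCounter]  // index == count - 1, so still in range
oscCounter += 1
AudioKit.output = currentOutput
AudioKit.start()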

How do you convert a Wavefront OBJ file to an SCNNode with Model I/O

I've imported a Wavefront OBJ file from a URL and now I'd like to insert it into my scene (SceneKit) in my iOS 9 app (in Swift). What I've done so far is:
let asset = MDLAsset(URL: localFileUrl)
print("count = \(asset.count)") // 1
Any help converting this to an SCNNode would be appreciated. According to Apple's docs:
Model I/O can share data buffers with the MetalKit, GLKit, and SceneKit frameworks to help you load, process, and render 3D assets efficiently.
But I'm not sure how to get buffer from an MDLAsset into a SCNNode.
Turns out this is quite easy, as many of the Model I/O classes already bridge. I was doing import ModelIO, which gave me access to all the Model I/O classes, and likewise import SceneKit, which gave me the SceneKit classes; but I was missing import SceneKit.ModelIO, which brings in SceneKit's support for Model I/O.
let url = NSURL(string: "url-to-your-obj-here")
let asset = MDLAsset(URL: url!)
let object = asset.objectAtIndex(0)
let node = SCNNode(MDLObject: object)
Easy as that...
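From there, attaching it to your scene is the usual SceneKit call (assuming an existing SCNView named scnView):
scnView.scene?.rootNode.addChildNode(node)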
