How to add a custom UIView to the SCNView of ARKit? - ios

I want to add a UIView to the SCNView of ARKit.
How do I create a node from a UIView, so that I can add my view with the SCNView's addChildNode()?

You can now. It’s possible to use a UIView as the contents of an SCNMaterialProperty.
See the following post for how to do so:
Using a UIView in 3D in SceneKit
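Using a UIView as material contents is not an officially documented contents type, so treat the following as a sketch of what the linked post describes; sceneView is assumed to be your existing ARSCNView, and the sizes are arbitrary.

```swift
import SceneKit
import UIKit

// Sketch only: a UILabel used as the diffuse contents of a plane's material.
// UIView is not a documented contents type for SCNMaterialProperty; see the
// linked post for caveats (e.g. the view should not be part of the on-screen
// view hierarchy).
let label = UILabel(frame: CGRect(x: 0, y: 0, width: 200, height: 50))
label.text = "Hello ARKit"
label.backgroundColor = .white

let plane = SCNPlane(width: 0.2, height: 0.05)   // 20 cm x 5 cm in scene units
plane.firstMaterial?.diffuse.contents = label

let node = SCNNode(geometry: plane)
// sceneView is assumed to be your existing ARSCNView:
// sceneView.scene.rootNode.addChildNode(node)
```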

ARKit isn't a rendering engine — it doesn't display any content for you. ARKit provides information about real-world spaces for use by rendering engines such as SceneKit, Unity, and any custom engine you build (with Metal, etc).
SceneKit can't render a UIView as part of a 3D scene. But it can render planes, cubes, and other shapes, and texture-map 2D content onto them. If you want to draw a text label on a plane detected by ARKit, that's the direction to investigate: create SCNPlane objects corresponding to detected ARPlaneAnchors, render your text to an image, and use that image as the plane's material.
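A minimal sketch of that direction, assuming the text has already been rendered to a UIImage beforehand (the class and property names here are placeholders):

```swift
import ARKit
import SceneKit

// Sketch: an ARSCNViewDelegate that textures each detected plane with a
// pre-rendered text image. `textImage` is an assumed, pre-rendered UIImage.
class PlaneTextRenderer: NSObject, ARSCNViewDelegate {
    let textImage: UIImage

    init(textImage: UIImage) { self.textImage = textImage }

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }

        // Size an SCNPlane to the detected plane and texture it with the image.
        let plane = SCNPlane(width: CGFloat(planeAnchor.extent.x),
                             height: CGFloat(planeAnchor.extent.z))
        plane.firstMaterial?.diffuse.contents = textImage

        let planeNode = SCNNode(geometry: plane)
        planeNode.eulerAngles.x = -.pi / 2   // SCNPlane stands vertical by default
        planeNode.position = SCNVector3(planeAnchor.center.x, 0, planeAnchor.center.z)
        node.addChildNode(planeNode)
    }
}
```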

Related

RealityKit: Render an object under a grounding shadow

I have a placement marker (a simple plane with green corners) to visualize detected planes (via ARRaycastResult) in an ARView. This placement marker uses UnlitMaterial with a texture. Everything works fine as long as there is no other object added.
When I add another object, RealityKit also adds a grounding shadow (an invisible plane) right under the object. It works as a shadow plane and occludes everything behind it - including my placement marker.
Here is a picture of the placement marker (part of which is hidden under the shadow plane):
Is there any way to prevent this clipping? I was looking for something like a rendering order (as SceneKit has one), but have not found anything in RealityKit yet.
I would like to keep the shadow plane if possible.
Edit: Added official name of the invisible plane (grounding shadow)
RealityKit automatically adds "grounding shadows" if the AnchoringComponent.Target is of type "plane". According to an Apple engineer, this can also be simulated using a DirectionalLight with a shadow and placing a plane (with an OcclusionMaterial) under the model. So the grounding shadow is probably also made of OcclusionMaterial, which would explain why it occludes other objects.
There is an option to disable rendering of these grounding shadows: insert the disableGroundingShadows option into the ARView's renderOptions property.
arView.renderOptions.insert(.disableGroundingShadows)
I have not yet found a way to override OcclusionMaterial (if any).
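A sketch of the simulation the Apple engineer described, with placeholder entities standing in for your actual content (arView and the box model are assumed names, not part of the original question):

```swift
import RealityKit

// Sketch of the described setup: a directional light with shadows plus an
// occlusion-material plane under the model, simulating a grounding shadow.
let arView = ARView(frame: .zero)   // placeholder for your existing ARView

let light = DirectionalLight()
light.shadow = DirectionalLightComponent.Shadow()
light.look(at: .zero, from: [0, 1, 0.5], relativeTo: nil)

// An occlusion-material plane under the model stands in for the grounding shadow.
let shadowPlane = ModelEntity(mesh: .generatePlane(width: 0.5, depth: 0.5),
                              materials: [OcclusionMaterial()])

let modelEntity = ModelEntity(mesh: .generateBox(size: 0.1),
                              materials: [SimpleMaterial()])
modelEntity.position.y = 0.05   // sit the box on top of the plane

let anchor = AnchorEntity(plane: .horizontal)
anchor.addChild(light)
anchor.addChild(shadowPlane)
anchor.addChild(modelEntity)
arView.scene.addAnchor(anchor)
```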

Draw on face of SCNNode

I am experimenting with ARKit and SceneKit and have been able to place boxes or custom shapes using UIBezierPath. What I’d like to do next is draw text and place images on the surface of these shapes. I’ve tried adding an image to the material property but this just fills the shape with an image. I’d like to have control over the size and position of the image / text relative to the shape. Is this possible?
You can do so by creating a new node for your text or image and adding it as a child node of your existing SCNNode (addChildNode). You can then position it as needed relative to your parent node, much as you have already done with your SCNNode.
For text you can make your SCNNode from an SCNText, or render the text into a bitmap and use it as a texture material of an SCNPlane.
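Both options can be sketched roughly like this; boxNode stands in for your existing shape node, and the sizes and positions are arbitrary examples:

```swift
import SceneKit
import UIKit

// boxNode stands in for your existing shape node.
let boxNode = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1, length: 0.1,
                                       chamferRadius: 0))

// Option 1: an SCNText child node, positioned just in front of one face.
let text = SCNText(string: "Hello", extrusionDepth: 0.5)
text.font = UIFont.systemFont(ofSize: 4)
let textNode = SCNNode(geometry: text)
textNode.scale = SCNVector3(0.01, 0.01, 0.01)        // SCNText is huge by default
textNode.position = SCNVector3(-0.04, -0.02, 0.051)  // just outside the front face
boxNode.addChildNode(textNode)

// Option 2: render the text into a bitmap and use it as an SCNPlane texture.
let renderer = UIGraphicsImageRenderer(size: CGSize(width: 200, height: 100))
let image = renderer.image { _ in
    ("Hello" as NSString).draw(at: .zero,
                               withAttributes: [.font: UIFont.systemFont(ofSize: 40)])
}
let plane = SCNPlane(width: 0.1, height: 0.05)
plane.firstMaterial?.diffuse.contents = image
let planeNode = SCNNode(geometry: plane)
planeNode.position = SCNVector3(0, 0.06, 0.051)
boxNode.addChildNode(planeNode)
```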

How do I create my own coordinate system for an SKScene?

My goal is to take an existing SKScene and stretch it according to a polynomial function, like one stretching everything toward or away from the center. The stretched form will be continuously rendered and presented to the user. It may be a new scene/image/view or whatever is necessary. The model will simply perform its functions over time in Euclidean form.
My project content is little more than the iOS SpriteKit starter project on Xcode.
I know of the functions in SKScene:
convertPointToView(), and
convertPointFromView()
However, I don't understand how these will be much use here, since the scene only offers aspect-fill, aspect-fit, and resize scaling modes.
I attempted to make a fragment shader to do the actual stretching, however, I could not figure out how to get existing color and position information to draw the new color according to the transformation.
I am using SpriteKit, and the only shaders I know how to access from it are fragment shaders, via SKShader. I do not know how to access vertex shaders in this context; otherwise, I would have tried to use one.
You could go with SceneKit:
Create an SCNView with an SCNScene
Create an SCNNode with SCNPlane geometry (or a custom SCNGeometry)
Create an SCNMaterial and assign your SKScene to the material's diffuse.contents property
Attach the material to the geometry
Attach the node to the scene
Then you have multiple choices:
Use SCNShadable - either attach shader modifier for geometry or material, or use custom SCNProgram.
Use SCNTechnique on SCNView.
This way you will have your SKScene as a texture on a 3D object (a plane or something else) and have full control of vertex and fragment shaders.
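A rough sketch of those steps, using a geometry shader modifier for the stretch; the polynomial here is a made-up example, and the view and scene sizes are arbitrary:

```swift
import SceneKit
import SpriteKit

// Sketch of the steps above: an SKScene used as the diffuse texture of an
// SCNPlane, warped by a geometry shader modifier.
let scnScene = SCNScene()
let scnView = SCNView(frame: CGRect(x: 0, y: 0, width: 400, height: 400))
scnView.scene = scnScene

let skScene = SKScene(size: CGSize(width: 400, height: 400))  // your existing scene

let plane = SCNPlane(width: 4, height: 4)
// More segments so the vertex-level stretch has enough vertices to warp.
plane.widthSegmentCount = 50
plane.heightSegmentCount = 50

let material = SCNMaterial()
material.diffuse.contents = skScene   // SKScene is a supported contents type
plane.materials = [material]

// A geometry shader modifier (Metal) that pushes vertices away from the
// center by a polynomial of the radius; adjust to your function.
plane.shaderModifiers = [
    .geometry: """
    float2 p = _geometry.position.xy;
    float r = length(p);
    _geometry.position.xy = p * (1.0 + 0.1 * r * r);
    """
]

let node = SCNNode(geometry: plane)
scnScene.rootNode.addChildNode(node)
```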

CoreGraphics elements inside SKNode

I'm building a game-type app using SpriteKit. In one of the scenes I want to create an area in which the user will be able to draw. Sadly, using SKShapeNodes produces jagged lines and causes the FPS to drop. I thought about using a Core Graphics approach, but I need the drawn lines to be part of a node. So is there a way to use a node as a canvas for Core Graphics?
From the SKNode documentation:
Unlike views, you cannot create SKNode subclasses that perform custom drawing.
So I think the answer is no, you can't do that.
Each node does have a scene property, and the scene does have a link to the view that contains it. But the thing that makes sprite animation fast is that sprites are canned -- the images have already been drawn and just need to be copied. Node types other than SKSpriteNode are similarly optimized for speed. Accordingly, there are no drawing methods in the sprite classes -- no opportunity for your code to do custom drawing.
You can draw into a CGImage and create an SKTexture from it using textureWithCGImage:
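For example, a rough sketch using UIGraphicsImageRenderer to produce the image; the drawing code and sizes are placeholders for your own:

```swift
import SpriteKit
import UIKit

// Sketch: draw lines with Core Graphics, wrap the result in an SKTexture,
// and display it on a sprite node acting as the "canvas".
let size = CGSize(width: 256, height: 256)
let renderer = UIGraphicsImageRenderer(size: size)
let image = renderer.image { ctx in
    ctx.cgContext.setStrokeColor(UIColor.red.cgColor)
    ctx.cgContext.setLineWidth(4)
    ctx.cgContext.move(to: CGPoint(x: 20, y: 20))
    ctx.cgContext.addLine(to: CGPoint(x: 236, y: 236))
    ctx.cgContext.strokePath()
}

let texture = SKTexture(cgImage: image.cgImage!)
let canvasNode = SKSpriteNode(texture: texture)
// Re-create the texture (and reassign it) whenever the drawing changes.
```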

SKPhysicsBody with custom view?

With UIDynamics it is simple to have the physics objects, called dynamic items, control custom views (or indeed anything you want) using the UIDynamicItem protocol with its three properties: center, transform, and bounds. I would like to do something similar with SpriteKit, but according to the documentation:
Unlike views, you cannot create SKNode subclasses that perform custom drawing
Specifically, I would like to have the physics bodies control some vector graphics I currently have in a drawRect. There are two things I am after here:
The first is to let the vector graphics move around like any other node.
The second is to let position, angle and other properties change the exact position of some of the control points.
I am fairly certain that I could achieve this with UIDynamics and dynamic items, but then I wouldn't be able to use actions and other nice sprite kit features.
It would also seem that the first could be handled by converting the paths to CGPaths and using shape nodes, but that would be limiting, and would not cover the second.
Any suggestions?