Sprite Kit vs Scene Kit for Building Information Modelling - iOS

I am trying to develop a building information model viewer for iPad and I am faced with the following choice: should I use SpriteKit or SceneKit? I know SceneKit is meant for rendering 3D while SpriteKit is 2D. From my research so far, SceneKit seems more appropriate for building information modelling, as it will represent a 3D model of a building. However, I would like to know if I can do it with SpriteKit (I read SpriteKit is easier to learn) or whether I should use SceneKit. Thanks for your input. I am new to iOS development, so any assistance would be helpful.

SceneKit and SpriteKit are very similar to each other. SceneKit is a little harder to learn, but it's pretty simple. Of the options you mention, SceneKit would be the only way to display a 3D model. You can place a SpriteKit scene on top of the SceneKit scene to display labels that stay put.

You'll probably want to use both of them. SceneKit represents 3D very nicely, but can also accept SpriteKit scenes to use for backgrounds, foreground overlays, and object textures/materials.
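Here's a minimal sketch of that overlay idea. SCNView's overlaySKScene property hosts a SpriteKit scene that renders as a fixed 2D layer over the 3D content; the asset name "building.scn" and the label text are placeholders:

```swift
import SceneKit
import SpriteKit

// Load a 3D scene into a SceneKit view ("building.scn" is a placeholder asset).
let scnView = SCNView(frame: CGRect(x: 0, y: 0, width: 800, height: 600))
scnView.scene = SCNScene(named: "building.scn")
scnView.allowsCameraControl = true   // lets the user orbit/zoom the model

// Build a SpriteKit scene to act as a 2D HUD over the 3D view.
let overlay = SKScene(size: scnView.bounds.size)
overlay.isUserInteractionEnabled = false

let label = SKLabelNode(text: "Floor 3 - North Wing")
label.fontSize = 18
label.position = CGPoint(x: overlay.size.width / 2, y: 40)
overlay.addChild(label)

// The overlay stays put on screen while the camera moves around the model.
scnView.overlaySKScene = overlay
```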

You can think of SpriteKit as sitting on top of SceneKit: with SceneKit you can place 3D models into augmented reality, while SpriteKit is used to add extra 2D sprites onto the model.
In short, SpriteKit has been a revolution in 2D gaming.

Related

Can we develop LiDAR apps using ARKit with SceneKit?

I have read on many forums that if we want to develop a LiDAR application, we need to use RealityKit instead of SceneKit. I am in the middle of working through Apple's LiDAR tutorial, but instead of using RealityKit, I used SceneKit. Now I have a problem, since SceneKit doesn't offer the sceneUnderstanding feature to render graphics. So I basically want to know:
Can't we develop LiDAR applications using ARKit with SceneKit?
Can we achieve sceneUnderstanding feature using SceneKit?
Can we develop LiDAR apps without using sceneUnderstanding?
Really appreciate your answers and comments. Thank you.
You can use scene understanding with any renderer. But only RealityKit comes with integration for this feature.
The ARWorldTrackingConfiguration comes with a sceneReconstruction flag that can be enabled.
Then ARKit creates ARMeshAnchor instances for you and delivers them through the ARSessionDelegate and ARSCNViewDelegate methods.
However, because SceneKit does not come with out-of-the-box support for these features, you would have to build any visualization or physics interaction yourself based on the ARMeshAnchor properties.
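A minimal sketch of that setup, assuming a LiDAR-capable device: the configuration flag and the ARMeshAnchor delivery are standard ARKit API, while the visualization step is left as a comment because SceneKit gives you nothing out of the box here.

```swift
import ARKit
import SceneKit
import UIKit

class MeshViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)

        let config = ARWorldTrackingConfiguration()
        // Scene reconstruction is only supported on LiDAR-equipped devices.
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            config.sceneReconstruction = .mesh
        }
        sceneView.session.run(config)
    }

    // ARKit delivers the reconstructed mesh as ARMeshAnchor instances.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let meshAnchor = anchor as? ARMeshAnchor else { return }
        // SceneKit has no built-in visualization: you would convert
        // meshAnchor.geometry's vertex and face buffers into an SCNGeometry
        // (or a physics shape) and attach it to `node` yourself.
        _ = meshAnchor
    }
}
```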

iOS - Combining SpriteKit and Metal

Is it possible to combine SpriteKit with Metal? And if it is, how could one combine Metal particles and SKNodes in a physics world so that they collide with each other? What's the usual approach for this kind of requirement?
Thanks
They are two totally different technologies. Sprite Kit is a framework that abstracts all of the rendering work for you and provides a built-in physics engine, whereas Metal is purely a low-level GPU-accelerated graphics API that gives you complete control over the rendering process. It is similar to OpenGL ES but with much less overhead.
Sprite Kit will use Metal (on eligible devices) to render your scene. You don't need to do a single thing because Sprite Kit handles all rendering behind-the-scenes.
You don't combine them; they are two totally different frameworks. If you are looking to add physics to Metal, then you will either need to write your own physics engine or use an already existing engine like Box2D (which I believe Sprite Kit uses internally).
This appears to be possible now using SKRenderer, which allows you to mix SpriteKit and Metal (by the looks of it, adding SpriteKit to Metal and vice versa).
It's iOS 11+, macOS 10.13+ and tvOS 11+.
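As a rough sketch of how SKRenderer slots into a Metal render loop: the SKRenderer calls below are the real API, while the device, command queue, CAMetalLayer, and per-frame plumbing are assumed to already exist in your app.

```swift
import SpriteKit
import Metal
import QuartzCore

// Minimal sketch: render an SKScene into your own Metal pass via SKRenderer.
func drawSpriteKitOverlay(device: MTLDevice,
                          commandQueue: MTLCommandQueue,
                          layer: CAMetalLayer,
                          scene: SKScene,
                          time: CFTimeInterval) {
    guard let drawable = layer.nextDrawable(),
          let commandBuffer = commandQueue.makeCommandBuffer() else { return }

    let renderer = SKRenderer(device: device)   // in practice, create this once and reuse it
    renderer.scene = scene
    renderer.update(atTime: time)               // advance the scene's actions and physics

    let passDescriptor = MTLRenderPassDescriptor()
    passDescriptor.colorAttachments[0].texture = drawable.texture
    passDescriptor.colorAttachments[0].loadAction = .load   // keep your Metal content underneath
    passDescriptor.colorAttachments[0].storeAction = .store

    let viewport = CGRect(x: 0, y: 0,
                          width: drawable.texture.width,
                          height: drawable.texture.height)
    renderer.render(withViewport: viewport,
                    commandBuffer: commandBuffer,
                    renderPassDescriptor: passDescriptor)

    commandBuffer.present(drawable)
    commandBuffer.commit()
}
```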

Workflow between Blender and Xcode (SceneKit)

I'm using Blender to create a landscape for a game being built with SceneKit.
As it is just a landscape, I won't be using any animations, so I'm wondering before I dive too deep into Blender: should I be using Blender to create the geometry and then create my own materials in SceneKit?
I could still create the shadow, emission, specular, etc. maps in Blender, but would there be a performance benefit or penalty for doing it this way?
Also, if this is a path I could take, should I then be exporting as .dae, or is there a way to export a normal map that Xcode would be happy with?
SceneKit supports materials exported in DAE from Blender. It doesn't support every possible shading option that Blender has, but unless you're doing exotic stuff it should cover most of what you're looking for.
At run time there's no difference between materials loaded from DAE and those created programmatically.
What you do want to think about at authoring/export time is stuff like real-time versus static lighting/shadows and high-poly geometry versus baked normal maps. In other words, material performance is more about how the materials are set up (complexity) than where they're set up (imported or at run time). See WWDC 2014 session 610: Building a Game with SceneKit for some tips.
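For example, here is a small sketch of the "geometry from Blender, materials in SceneKit" approach; the file name "landscape.dae", the node name "Terrain", and the texture names are all hypothetical:

```swift
import SceneKit
import UIKit

// Load geometry exported from Blender ("landscape.dae" is a placeholder asset).
guard let scene = SCNScene(named: "landscape.dae") else { fatalError("missing asset") }

// Find the terrain node by the name it was given in Blender (hypothetical name).
if let terrain = scene.rootNode.childNode(withName: "Terrain", recursively: true) {
    // Build the material in code instead of relying on the exported one.
    let material = SCNMaterial()
    material.diffuse.contents = UIImage(named: "grass_diffuse")
    material.normal.contents = UIImage(named: "grass_normal")   // normal map baked in Blender
    material.specular.contents = UIColor.darkGray
    terrain.geometry?.firstMaterial = material
}
```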

Individual particles and physics in Sprite Kit

I am a long-time user of Stack Overflow, but this is my first post.
My question is seemingly simple: is there a way to make particles from an emitter interact with the physics sprites in the scene? (For example, if I am using a particle for rain, I want it to bounce or bump off a sprite of a man with an umbrella.) There must be a way, but I don't see a lot of documentation on adding physics to individual particles. Any ideas?
Thanks!
No. There is no way to make SpriteKit's built in particles interact with physics bodies. Every particle property you can control is a property of SKEmitterNode, and it has no properties for setting physics behavior for particles.
The fact is that particles are designed to be very light-weight so that you can have thousands of them on any hardware supported by SpriteKit. Physics simulation is not light-weight.
There is LiquidFun, which is a Box2D extension that simulates the physics of a particle system. Box2D is the basis for Apple's SpriteKit physics engine, and you can use it in your game, but you have to tweak it a little bit to make it run. There are a lot of tutorials on how to use it in an iOS project. I am confident that Apple will add more features to SpriteKit in the future to make the particle system respond to physics.
You could use an SKFieldNode to simulate gravity and then add another field node on your umbrella to repel the particles.
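A small sketch of that field-based workaround; the emitter file "Rain.sks" and the umbrella texture are hypothetical, and note that the field only deflects particles rather than giving them true collisions:

```swift
import SpriteKit

// Particles can't collide with physics bodies, but they do respond to
// SKFieldNode. A radial field on the umbrella pushes rain particles away,
// approximating a bounce.
let rain = SKEmitterNode(fileNamed: "Rain.sks")!   // hypothetical emitter file
rain.fieldBitMask = 0x1                            // opt the particles in to fields

let repel = SKFieldNode.radialGravityField()
repel.strength = -4.0                              // negative strength pushes particles away
repel.region = SKRegion(radius: 60)                // limit the field's reach
repel.categoryBitMask = 0x1                        // must match the emitter's fieldBitMask

let umbrella = SKSpriteNode(imageNamed: "umbrella") // hypothetical texture
umbrella.addChild(repel)
```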

Blender File to Xcode

We want to make a 3D game for the Apple iPad, and we are thinking about a way to import 3D models from Blender into Xcode.
Is there a way to do that?
We want to use OpenGL ES 2.0.
Xcode isn't a game engine or 3D SDK. You can't 'import' Blender files into Xcode. What you can do is take your 3D assets and use them within your apps, either directly through OpenGL (rather low level) or using a 3D engine such as Unity (easier).
Either way, there are a number of questions already on Stack Overflow that you might find useful:
Opengl ES how to import a 3D model and map textures to it on runtime
Iphone + OpenGL ES + Blender Model: Rotation by Touch
Choosing 3D Engine for iOS in C++
...I highly recommend you take a look at what options are out there, decide on the best way to implement your 3D game (be it raw OpenGL or an engine), and go from there.
