Swift: IMDF indoor map: enable 3D effect instead of a flat surface (iOS)

I am playing around with IMDF from the new Apple Indoor Maps Program.
In particular, I am experimenting with a 3D effect for the building (walls etc.).
In the official Maps app this effect is easy to achieve by tilting the camera to an angle between 45° and 89°.
I tried to achieve this by enabling the pitch gesture (self.mapView.isPitchEnabled = true) in Apple's IMDF sample app, but the result is only a flat surface.
I then enabled buildings (self.mapView.showsBuildings = true), thinking that this might trigger some function that renders the IMDF data in 3D, but the result was disappointing:
the IMDF data renders only as a kind of flat overlay on the map.
Looking at the IMDF documentation, airport indoor maps are also built with IMDF, so what is the trick?
Is it even possible to get the 3D effect as a third-party developer?
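For reference, the settings described above can be combined like this (a sketch only; in my testing the IMDF venue overlay still renders flat, because pitch and building extrusion only affect Apple's own base-map data):

```swift
import MapKit

// Sketch: the options tried above, plus an explicit pitched camera.
// None of these extrude IMDF unit/level polygons into 3D walls.
func configurePitchedMap(_ mapView: MKMapView) {
    mapView.isPitchEnabled = true      // allow the two-finger pitch gesture
    mapView.showsBuildings = true      // extrudes Apple's own building data only

    // Tilt the camera programmatically, like the Maps app does.
    let camera = MKMapCamera(
        lookingAtCenter: mapView.centerCoordinate,
        fromDistance: 300,             // metres from the look-at point
        pitch: 60,                     // degrees from straight down
        heading: 0
    )
    mapView.setCamera(camera, animated: true)
}
```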
Update 29.06.2020:
On 08.06.2020 I emailed the IMDF-specific address provided by Apple (imdfquestions@apple.com). No answer so far.
I asked the same question on the Apple Developer Forums. No answer there either.

Related

Does Apple's PencilKit support "draw and hold to create perfect shapes"? If not, how can it be implemented?

What I want:
The picture is from the Notes app on iPad. I drew the blurred red path with the Apple Pencil; holding at the end of the stroke generated the perfect circle shape.
I checked Apple's developer documentation on PencilKit, and there seems to be no related API to support this.
But Apple's built-in Notes app on iPad can do it. Does that mean Apple has not opened up the relevant API? I really need to implement "draw and hold to create perfect shapes" in our app. How can it be done?
The only thing I can think of is using Core ML to reshape the pencil path, but I know very little about it.
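One way to approximate the Notes behaviour without Core ML is plain geometry: when the pencil has been held still at the end of a stroke, fit a circle to the sampled points and check how closely the stroke follows it. A minimal sketch (the function names are mine, not PencilKit API):

```swift
import Foundation

// Fit a circle to stroke points by taking the centroid as the centre
// and the mean distance to the centroid as the radius.
func fitCircle(to points: [CGPoint]) -> (center: CGPoint, radius: CGFloat)? {
    guard points.count >= 3 else { return nil }
    let n = CGFloat(points.count)
    let center = CGPoint(
        x: points.reduce(0) { $0 + $1.x } / n,
        y: points.reduce(0) { $0 + $1.y } / n
    )
    let radius = points.reduce(0) { $0 + hypot($1.x - center.x, $1.y - center.y) } / n
    return (center, radius)
}

// True when every point stays close to the fitted circle, i.e. the
// stroke is "circular enough" to be snapped to a perfect shape.
func looksLikeCircle(_ points: [CGPoint], tolerance: CGFloat = 0.15) -> Bool {
    guard let fit = fitCircle(to: points), fit.radius > 0 else { return false }
    let maxDeviation = points
        .map { abs(hypot($0.x - fit.center.x, $0.y - fit.center.y) - fit.radius) }
        .max() ?? .infinity
    return maxDeviation / fit.radius <= tolerance
}
```

If `looksLikeCircle` returns true while the touch has been stationary for, say, half a second, you could replace the drawn stroke with a generated circular path. Recognising other shapes (rectangles, lines) would need analogous fits.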

Share 3D objects in a city and have people see them with ARKit

I'm looking for something that lets people place 3D objects around a city and have other people see them exactly where they were placed, even years apart. I'm looking for something asynchronous rather than a real-time experience, so ARWorldMap is out of the equation. Google Cloud Anchors does not work with provided GPS coordinates either. I've looked into the .gravityAndHeading world alignment; how could I use it? Is this project feasible with ARKit?
Try Sturfee; it works on top of ARKit but is only available from within Unity and in certain cities.
https://sturfee.com/
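Since this answer was written, ARKit 4 (iOS 14) added location anchors, which are the closest built-in fit for this use case: ARGeoAnchor is defined by latitude/longitude (and optionally altitude) and resolved against Apple's localisation data, so placements survive across sessions and devices, though only in cities Apple has mapped for geotracking. A sketch:

```swift
import ARKit
import CoreLocation

// Sketch: run a geotracking session and drop a persistent anchor at a
// geographic coordinate. Only works where Apple supports geotracking.
func startGeoSession(on sceneView: ARSCNView,
                     latitude: Double, longitude: Double) {
    guard ARGeoTrackingConfiguration.isSupported else { return }

    let configuration = ARGeoTrackingConfiguration()
    sceneView.session.run(configuration)

    let coordinate = CLLocationCoordinate2D(latitude: latitude,
                                            longitude: longitude)
    // ARKit fills in the altitude when it is omitted.
    let anchor = ARGeoAnchor(coordinate: coordinate)
    sceneView.session.add(anchor: anchor)
}
```

Content attached to the anchor's node (via the session delegate) then appears at that real-world location for anyone running the app there.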

Creating a 360 photo experience on iOS mobile device

I am interested in VR and trying to get a bit more information. I want to create an experience on iOS where I can take a 360° image and view it on an iOS device by tilting the phone: using the device's gyroscope, tilting the phone pans around the 360° image (like the tilt gesture in Google Street View).
Something similar to this app: http://bubb.li/
Can anybody give a brief overview of how this would be doable, and point me to any sources, APIs, etc. that could help me achieve it?
Much appreciated.
There are two options for capture: you can use a dedicated device to take the image for you, or you can write code to stitch together multiple images taken from the iOS device as you move it around a standing point.
I've used the Ricoh Theta for this (no affiliation). Their SDK includes a 360 viewer that maps 360 images onto a sphere, which works exactly as you've asked.
Assuming you've figured out how to create 360 photo spheres, you can use Unity, Unreal, or probably other development platforms to create navigation between the locations you captured.
Here is a tutorial that looks pretty detailed for doing this in Unity:
https://tutorialsforvr.com/creating-virtual-tour-app-in-vr-using-unity/
One advantage of doing this in something like Unity or Unreal is that once navigation between multiple photo spheres is working, it's fairly easy to add animation or other interactive elements. I've seen interactive stories done with 360 video using this method.
(I see that the question is from a fairly long time ago, but it was one of the top results when I looked for this same topic)
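On iOS itself, the usual approach is SceneKit: texture the inside of a sphere with the equirectangular 360 image, put the camera at the sphere's centre, and rotate the camera from device motion (CMMotionManager) as the phone tilts. A sketch, assuming you already have the equirectangular image:

```swift
import SceneKit
import UIKit

// Sketch: view an equirectangular 360 photo by texturing the inside
// of a sphere and placing the camera at its centre.
func makePhotoSphereScene(image: UIImage) -> SCNScene {
    let scene = SCNScene()

    let sphere = SCNSphere(radius: 10)
    sphere.firstMaterial?.diffuse.contents = image
    sphere.firstMaterial?.isDoubleSided = true
    // Mirror the texture horizontally so it reads correctly from inside.
    sphere.firstMaterial?.diffuse.wrapS = .repeat
    sphere.firstMaterial?.diffuse.contentsTransform =
        SCNMatrix4MakeScale(-1, 1, 1)
    scene.rootNode.addChildNode(SCNNode(geometry: sphere))

    let cameraNode = SCNNode()
    cameraNode.camera = SCNCamera()
    cameraNode.position = SCNVector3Zero   // centre of the sphere
    scene.rootNode.addChildNode(cameraNode)

    return scene
}
```

To get the Street View-style tilt behaviour, feed `CMMotionManager`'s `deviceMotion.attitude` into the camera node's orientation each frame instead of using gesture-based rotation.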

Reproduce Google Earth-like flyover with iOS 7 MapKit's custom tiles

I would love to reproduce a Google Earth-like 3D map flyover, even when offline.
As of iOS 7, MapKit allows us to draw custom offline tiles. It also allows us to set a camera in order to see the map in 3D, or 2.5D as you may wish to call it.
I was wondering: can I draw a 3D shape, like Apple does for its flyover feature, on my custom tiles?
I would need to apply a "bump map" to the map in order to get a Google Earth-like 3D view, and I was wondering whether Apple allows just that with iOS 7 custom tile rendering plus camera settings.
Thanks
I have experimented pretty extensively with this, and there is no supported way to do it. Right now, Apple only offers raster tile-based overlays, albeit with an automatic 2.5D/3D transformation when they are overlaid on the map. Hopefully in the future they will support a 3D API and/or custom (say, OpenGL-based) augmentation of the map.
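For completeness, the supported pieces look like this: custom raster tiles via MKTileOverlay plus a pitched MKMapCamera. MapKit perspective-transforms the tiles along with the base map, but there is no public hook to extrude them into real 3D geometry. A sketch (the tile URL template is a placeholder):

```swift
import MapKit

// Sketch: custom raster tiles drawn under a tilted camera.
// The tiles get the same 2.5D perspective as the base map, nothing more.
func setUpCustomTiles(on mapView: MKMapView) {
    let template = "https://example.com/tiles/{z}/{x}/{y}.png" // placeholder
    let overlay = MKTileOverlay(urlTemplate: template)
    overlay.canReplaceMapContent = true   // hide Apple's base map entirely
    mapView.addOverlay(overlay, level: .aboveLabels)

    let camera = MKMapCamera(
        lookingAtCenter: mapView.centerCoordinate,
        fromDistance: 1000,
        pitch: 55,
        heading: 0
    )
    mapView.camera = camera
}
```

The map view's delegate must also return an `MKTileOverlayRenderer` for the overlay from `mapView(_:rendererFor:)`, otherwise the tiles never draw.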

Does Google Maps SDK for iOS display 3D maps?

I have integrated the Google Maps SDK into an iOS application and would like to display 3D satellite maps. According to the documentation this should just work. I can tilt the view, but the displayed map remains flat (i.e. mountains do not show up in 3D as they do in Google Earth).
I have searched extensively for this but found no reference to whether it actually works. Does anybody know whether 3D maps work in the Google Maps SDK for iOS and I am just hitting some limitation or wrong switch, or whether they do not work at all?
As of SDK v1.8, tilted layers do appear to have some 3D elevation effects, but it's more subtle than Google Earth typically is.
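For reference, tilting the Google Maps camera looks like this (a sketch; the coordinates are just an example in the Alps, and how much terrain relief you see depends on the SDK version, zoom level, and map type rather than on any further switch):

```swift
import GoogleMaps

// Sketch: satellite map viewed through a tilted camera. Any elevation
// effect the SDK offers shows up only on tilted views at high zoom.
func makeTiltedMapView() -> GMSMapView {
    let camera = GMSCameraPosition.camera(
        withLatitude: 46.5584,   // example: Swiss Alps
        longitude: 8.5611,
        zoom: 15,
        bearing: 0,
        viewingAngle: 65         // degrees of tilt
    )
    let mapView = GMSMapView.map(withFrame: .zero, camera: camera)
    mapView.mapType = .satellite
    return mapView
}
```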
