Is marker-based AR possible with small, mostly round markers? - augmented-reality

I've recently started learning about AR and would like to develop an app to use with a smart device we have at home. The app would show you which button on the controller to press after detecting the button as a marker. The thing is, the more I research the topic, the less confident I am that this is possible. My main concerns are that the button is too small (around 3 cm x 3 cm) and that its design is too round, which is a known problem for feature detection.
My question is whether it's even possible (especially for an AR beginner) to develop a somewhat accurate application with this kind of marker.
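For context, here is a minimal sketch of what the setup might look like with ARKit image tracking, assuming the button face can be exported as a reference image. The resource group name is hypothetical, the ~3 cm physical size comes from the question, and whether such a small, low-feature image tracks reliably is exactly the open question.

    import ARKit

    // Sketch: configure ARKit to track the button's face as a reference image.
    // "AR Resources" is a hypothetical group in the app's asset catalog; the
    // per-image physical width would be set there (about 0.03 m for this button).
    func makeImageTrackingConfiguration() -> ARImageTrackingConfiguration? {
        guard let references = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
                                                                bundle: nil) else { return nil }
        let configuration = ARImageTrackingConfiguration()
        configuration.trackingImages = references
        configuration.maximumNumberOfTrackedImages = 1
        return configuration
    }

    // In an ARSCNViewDelegate, an ARImageAnchor is delivered when the image is
    // detected, and the "press this button" indicator could be attached to its node.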
Button for reference:

Related

Why is Pokemon Go running on unsupported devices?

If most devices are not supported by ARCore, why does Pokemon Go run on every device?
My device is not supported by ARCore, but Pokemon Go runs on it with full performance.
Why?
Until October 2017, Pokemon Go appears to have used a Niantic-made AR engine. At a high level, the game placed the Pokemon globally in space at a server-defined location (the spawn point). The AR engine used the phone's GPS and compass to determine whether the phone should be turned to the left or to the right. Eventually, the phone pointed at the right heading and the AR engine drew the 3D model over the video coming from the camera. At that time there was no attempt to map the environment, recognize surfaces, etc. That was a simple, yet very effective technique which created the stunning effects we've all seen.
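To make that left/right guidance concrete, here is a rough sketch (my own illustration, not Niantic's actual code) of the GPS-plus-compass approach: compute the bearing from the player to the spawn point and compare it with the compass heading.

    import Foundation

    // Great-circle initial bearing from the player's position to the spawn point,
    // in degrees clockwise from true north. All inputs are in degrees.
    func bearing(fromLat lat1: Double, fromLon lon1: Double,
                 toLat lat2: Double, toLon lon2: Double) -> Double {
        let phi1 = lat1 * .pi / 180, phi2 = lat2 * .pi / 180
        let dLon = (lon2 - lon1) * .pi / 180
        let y = sin(dLon) * cos(phi2)
        let x = cos(phi1) * sin(phi2) - sin(phi1) * cos(phi2) * cos(dLon)
        let theta = atan2(y, x) * 180 / .pi
        return (theta + 360).truncatingRemainder(dividingBy: 360)
    }

    // Decide which way to turn: negative means turn left, positive means turn right.
    func turnOffset(compassHeading: Double, targetBearing: Double) -> Double {
        var diff = targetBearing - compassHeading
        if diff > 180 { diff -= 360 }
        if diff < -180 { diff += 360 }
        return diff
    }

    // Example: player at made-up coordinates, spawn point roughly 100 m to the north-east.
    let target = bearing(fromLat: 40.0000, fromLon: -74.0000, toLat: 40.0007, toLon: -73.9993)
    print(turnOffset(compassHeading: 10, targetBearing: target))  // positive: turn right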
After that, Niantic showed prototypes of Pokemon GO using ARKit for iOS. The enhancements are easy to notice: missed pokeballs bounce very naturally on the sidewalk and respect physics, and Pikachu appears to walk naturally on the sidewalk instead of floating in the air as in the then-current release. Most observers expected Niantic to replace the existing engine with ARKit (iOS) and ARCore (Android), possibly via Unity 3D's AR APIs.
In early 2018, Niantic improved the game on Android by adding support for ARCore, Google's augmented reality SDK, mirroring the update already released for iOS 11 with ARKit. That iOS update gave the virtual monsters a much greater sense of presence in the world thanks to camera tracking, allowing them to stand accurately on real-world surfaces rather than floating in the center of the frame. Android users need a phone compatible with ARCore in order to use the new "AR+" mode.
Prior to AR+, Pokémon Go used rough approximations of where objects were to place the Pokémon in your environment, but it was a clunky workaround that functioned mostly as a novelty feature. The new AR+ mode also lets iOS users take advantage of a new capture bonus, called expert handler, that involves sneaking up close to a Pokémon, so as not to scare it away, in order to capture it more easily. Since ARKit is designed to use the camera together with the gyroscope and the other sensors, it feeds in 60 fps at full resolution. It's a lot more performant and it actually uses less battery than the original AR mode.
For iOS users there's a standard list of supported devices:
iPhone 6s and higher
iPad 2017 and higher
For Android users not everything is so clear. Let's see why. Even if you have an officially unsupported device with poorly calibrated sensors, you can still use ARCore on your phone; for example, ARCore for All allows you to do it. So for Niantic, too, there would be no difficulty in making every Android phone suitable for Pokemon Go.
Hope this helps.

Placing objects automatically when a ground plane is detected with Vuforia

I'm working on an application where the concept is that you can 'select' objects before actually placing them. What I wanted to do was have some low-quality objects on a shelf or something like it. When the user selects an object, they can then tap to place the high-quality version of the object in their area for further viewing.
I was wondering if this is possible with Vuforia. I wanted to use this platform since it works well from what I could tell and it's cross-platform (the application needs to run on Android and the HoloLens).
I have set up the basic application where you can place a capsule in the area. Now I want to automatically place the object (in this case the capsule) once Vuforia has detected a ground plane. From what I could see, the plane finder has events that fire when an input is detected, but I couldn't find an event that fires when the ground plane itself is detected. Is this still possible with Vuforia? I know it's doable with the HoloLens, but I would like to know if it's possible on Android or other mobile devices. I really don't know where to start or what to look for, so I hope someone can point me in the right direction.
Let me know if I need to include more information!
The Vuforia PlaneFinderBehaviour (see doc here) has the event OnAutomaticHitTest, which fires every frame in which a ground plane is detected.
So you can use it to automatically spawn an object.
You have to add your method to the On Automatic Hit Test list (instead of the On Interactive Hit Test list) of the "Plane Finder":
I've heard that Vuforia Fusion does not yet support ARCore (it does support ARKit), so it uses an internal implementation to simulate ARCore functionality; they are waiting for a final release of ARCore before supporting it. Many users have reported that their objects move even when they use an ARCore-supported device.

ARKit: little jumps during tracking

I've been using ARKit and I'm loving it, but I noticed that tracking can get a little jumpy (suddenly objects jump a little bit off from their position, like 1-3 cm). I've been wondering if there's a way to smooth out these jumps so they wouldn't be so distracting. Here is a short video demonstrating it.
https://youtu.be/wmMBjlLyK7w
I have been using ARKit and am also loving it. I have been experiencing these issues as well, and I have my theories, but I am positive it is an issue with the hardware (comment which device you are using and I might be able to give a better estimate).
I believe it is the cameras on our devices, and if that is the case then I would not worry about it too much, because that would mean it's a behind-the-scenes problem we can't change or alter.
If I'm not mistaken, I remember Apple saying something about this in one of their developer sessions around this month's keynote. As I said before, I wouldn't worry about it; older devices will have a harder time with tracking because of their poorer cameras.
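The answer above puts this down to hardware, but purely as an illustration of what "smoothing out" the jumps could look like (my own suggestion, not something from this thread), one option is to ease content toward its anchor instead of snapping to it. The class and node names below are hypothetical.

    import ARKit
    import SceneKit
    import simd

    // Sketch: instead of parenting content directly to the anchor's node, keep a
    // separate "smoothed" node and ease it toward the anchor's latest pose every
    // frame, so small tracking corrections are spread over a few frames rather
    // than appearing as a sudden jump.
    final class SmoothedAnchorRenderer: NSObject, ARSCNViewDelegate {
        let smoothedNode = SCNNode()          // holds your visible content (hypothetical)
        weak var trackedAnchorNode: SCNNode?  // the node ARKit keeps updated for the anchor

        func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
            trackedAnchorNode = node
            smoothedNode.simdWorldTransform = node.simdWorldTransform
            renderer.scene?.rootNode.addChildNode(smoothedNode)
        }

        func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
            guard let target = trackedAnchorNode else { return }
            // Move about 20% of the remaining distance each frame (tune to taste).
            let t: Float = 0.2
            smoothedNode.simdWorldPosition += (target.simdWorldPosition - smoothedNode.simdWorldPosition) * t
            smoothedNode.simdWorldOrientation = simd_slerp(smoothedNode.simdWorldOrientation,
                                                           target.simdWorldOrientation, t)
        }
    }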

How to Visualize zPositions in iOS

My team and I are working on a SpriteKit-based iOS game of medium complexity. There are lots of layers and nodes in the design of the game, and the zPositioning of the nodes has gotten sloppy. One task I have agreed to take on is revamping our zPosition strategy: moving to constants instead of magic numbers, having a holistic zPosition scheme for the app, etc. But first I want to analyze where we are now. So here is my question:
I vaguely recall watching a WWDC video (or some other tutorial, maybe) in which the presenter used some aspect of Instruments (or some other tool) to show a 3D rendering of an app, seen from an isometric angle, based on the zPosition of the SKNodes (or UIKit elements?) in the app.
Does anyone here know what tool this is? And if not, what is the best way to visualize the current state of zPositions in a SpriteKit based app? Thanks!
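While the exact WWDC tool isn't identified here, one low-tech way to audit the current state (my own sketch, not a specific Apple tool) is to dump the node tree with its zPositions, which at least makes the magic numbers visible before introducing constants.

    import SpriteKit

    // Recursively print each node's name, its own zPosition, and the accumulated
    // zPosition relative to the scene, indented by depth, so layering problems stand out.
    func dumpZPositions(of node: SKNode, accumulated: CGFloat = 0, depth: Int = 0) {
        let effective = accumulated + node.zPosition
        let indent = String(repeating: "  ", count: depth)
        print("\(indent)\(node.name ?? String(describing: type(of: node))) " +
              "z=\(node.zPosition) accumulated=\(effective)")
        for child in node.children {
            dumpZPositions(of: child, accumulated: effective, depth: depth + 1)
        }
    }

    // Usage, e.g. from your SKScene's didMove(to:):
    // dumpZPositions(of: self)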

Hourglass for iPhone

I'm exploring iOS application development right now. I got a simple idea, which is to create an hourglass application. I did find some apps on the App Store, but the animation is not quite satisfying. Let's say I want to create a smooth animation of the sand falling down, with gravity; is there any advice on where I should start? I am thinking of using OpenGL ES for the animation, but I'm not sure if this is a good idea, since I do not know how to keep the sand in the hourglass and make each grain fall properly. Is there any algorithm that I should look into?
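The question leaves the approach open; purely as a sketch of the kind of algorithm to look into (a plain particle simulation with Euler integration and a crude floor collision, not a full hourglass, with all parameters made up):

    import Foundation

    // Minimal particle ("sand grain") simulation: each grain has a position and
    // velocity, gravity accelerates it downward, and it settles when it hits the floor.
    struct Grain {
        var x: Double, y: Double       // position in points
        var vx: Double, vy: Double     // velocity in points per second
    }

    let gravity = -500.0               // downward acceleration (made-up units)
    let floorY = 0.0

    func step(_ grains: inout [Grain], dt: Double) {
        for i in grains.indices {
            grains[i].vy += gravity * dt          // Euler integration of velocity...
            grains[i].x  += grains[i].vx * dt     // ...and of position
            grains[i].y  += grains[i].vy * dt
            if grains[i].y < floorY {             // crude floor collision: settle the grain
                grains[i].y = floorY
                grains[i].vx = 0
                grains[i].vy = 0
            }
        }
    }

    // Drop one grain from y = 100 and simulate one second at 60 steps per second.
    var sand = [Grain(x: 0, y: 100, vx: 5, vy: 0)]
    for _ in 0..<60 { step(&sand, dt: 1.0 / 60.0) }
    print(sand[0])

Collisions between grains and with the hourglass walls are what make real sand look right; searching for "granular simulation" or basic particle-system techniques is a reasonable next step, and the same per-frame update works whether the rendering is done with OpenGL ES or a higher-level framework.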
