I did a search on ARCore-enabled devices.
https://developers.google.com/ar/discover/supported-devices
It doesn't say whether it will support the Galaxy Tab S5E.
Will ARCore work on the Galaxy Tab S5E?
At the moment there's no Samsung Galaxy Tab S5E in the list of ARCore-supported devices because it's new; Google simply hasn't had time to update the list yet.
But I'm 99% sure it will be on the list, because the Samsung Galaxy Tab S5E has all the sensors required for an AR experience: a gyroscope, an accelerometer, a magnetometer and a rear RGB camera.
Why not 100%? Because business policy and hardware issues are unpredictable things.
Related
The ARKit API supports simultaneous world and face tracking via the back and front cameras, but unfortunately, due to hardware limitations, the new iPad Pro 2020 is unable to use this feature (probably because the LiDAR scanner draws a lot more power). This is a bit of a step back.
Here is an updated reference in the example project:
guard ARWorldTrackingConfiguration.supportsUserFaceTracking else {
    fatalError("""
        This sample code requires iOS 13 / iPadOS 13, and an iOS device with \
        a front TrueDepth camera. Note: 2020 iPads do not support \
        user face-tracking while world tracking.
        """)
}
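On devices that do support it, the opt-in is a flag on the world-tracking configuration itself. A minimal sketch (the sceneView ARSCNView is my assumption, not part of Apple's sample):

import ARKit

let configuration = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsUserFaceTracking {
    // Face anchors from the front TrueDepth camera are delivered
    // alongside the rear-camera world-tracking session.
    configuration.userFaceTrackingEnabled = true
}
sceneView.session.run(configuration)  // sceneView: an ARSCNView assumed to exist elsewhere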
There is also a forum conversation suggesting that this is a hardware limitation rather than an intentional restriction.
It looks like mobile technology is not "there yet" on either platform. However, for my use case I just want to be able to switch between front and back tracking modes seamlessly, without needing to reconfigure the tracking space. For example, I would like a button to toggle between a "now you track and see my face" mode and a "world tracking" mode.
There are two cases: either it's possible or it's not, and there may be alternative approaches depending on which it is.
Is it possible, or would switching AR tracking modes necessitate setting up the tracking space again? If so, how would it be achieved?
If it's impossible:
Even if I don't get face-tracking during world-tracking, is there a way to get a front-facing camera feed that I can use with the Vision framework, for example?
Specifically: how do I enable back-facing tracking, get front- and back-facing camera feeds simultaneously, and selectively disable one or the other? If that is possible even without front-facing tracking, with only the basic camera feed, it will work for me.
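To make the toggle concrete, here is a minimal sketch of what I mean, assuming a single shared ARSession (whether the world-tracking state survives such a switch is exactly what I'm unsure about):

import ARKit

// Assumed: one ARSession shared by the app, e.g. sceneView.session.
func switchToWorldTracking(on session: ARSession) {
    let config = ARWorldTrackingConfiguration()
    config.planeDetection = [.horizontal]
    // Running with empty options keeps existing anchors,
    // but the camera pipeline still switches to the rear camera.
    session.run(config, options: [])
}

func switchToFaceTracking(on session: ARSession) {
    // Front TrueDepth camera only; unsupported devices are skipped.
    guard ARFaceTrackingConfiguration.isSupported else { return }
    session.run(ARFaceTrackingConfiguration(), options: [])
}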
Is it practically possible for a developer right now (or at least theoretically in the future) to develop an app that can measure, via UWB, the distance to other iPhones?
UWB technology can take different forms in terms of ranging techniques. How does (or will) ranging work in these iPhones?
I have seen the iPhone 11 schematic and found that the UWB chip (U1) supports five antennas. Three of them are used for AoA positioning, while the other two support data transmission on UWB channels 5 and 9. Positioning between the iPhone and an AirTag should currently be carried out through the three AoA antennas; the other two should let the iPhone itself be used as a tag (positioned against other iPhones, or as an anchor/node).
Yes. This might be a late answer, but iOS provides the Nearby Interaction framework to get distance and direction information.
https://developer.apple.com/documentation/nearbyinteraction
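A minimal sketch of the ranging side with Nearby Interaction (the out-of-band exchange of discovery tokens, e.g. over MultipeerConnectivity, is assumed and not shown):

import NearbyInteraction

final class PeerRangingController: NSObject, NISessionDelegate {
    private var session: NISession?

    // peerToken must be received from the other iPhone out of band;
    // Nearby Interaction does not discover peers by itself.
    func startRanging(with peerToken: NIDiscoveryToken) {
        guard NISession.isSupported else { return }
        let session = NISession()
        session.delegate = self
        session.run(NINearbyPeerConfiguration(peerToken: peerToken))
        self.session = session
    }

    // Distance is in metres; direction is only available on supported
    // hardware and when the peer is within the U1 antennas' field of view.
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        for object in nearbyObjects {
            if let distance = object.distance {
                print("Distance to peer: \(distance) m")
            }
        }
    }
}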
Is it possible to always use the GPS chip in the Apple Watch for a WatchKit app, even when an iPhone is connected? (This concerns Apple Watch Series 2 and above, because these models have a GPS chip on board.)
For my app on watchOS I need the most accurate GPS data. Unfortunately, when an iPhone is connected, location requests (using the standard code) are delegated to the iPhone, and the iPhone's GPS chip is used. This makes perfect sense from a power-conservation perspective, but not from an accuracy perspective. The iPhone could be tucked away in a coat, a bag, etc. In that case the phone has no clear view of the sky, while the watch does. The GPS data from the watch is much, much more accurate in that case (and the iPhone's GPS data extremely inaccurate).
Is there a way to configure CLLocationManager to block delegation of GPS location requests to the iPhone and always use the GPS chip in the watch?
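For reference, the "standard code" mentioned above is just a plain CLLocationManager asked for its highest accuracy; a minimal watchOS sketch (the class name is mine, and I have not found a public property that forces the on-watch GPS):

import CoreLocation

final class WatchLocationProvider: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
        // The highest accuracy is the only public knob; watchOS still decides
        // whether to use the watch's own GPS or locations relayed from the iPhone.
        manager.desiredAccuracy = kCLLocationAccuracyBestForNavigation
        manager.distanceFilter = kCLDistanceFilterNone
        manager.requestWhenInUseAuthorization()
    }

    func startUpdates() {
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let latest = locations.last else { return }
        print("Horizontal accuracy: \(latest.horizontalAccuracy) m")
    }
}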
It is weird for me to come here and see that the P20 line is ARCore-capable but the Mate 10 line is not. I would like to know why, given that the hardware in the P20 Pro is almost the same except for the RAM and one extra lens; it just doesn't make any sense to me.
As far as I know, it has something to do with calibrating each phone based on its camera and motion sensors and their locations on the phone. So even though the specifications might seem similar, there are still differences in where the sensors and cameras sit.
They might add support in the future.
Keep checking this page for supported devices. As I remember, when ARCore first became available it was not supported on devices like the Galaxy S8 and S8+, and support was added later, so keep an eye out for it.
If most devices are not supported by ARCore, then why does Pokemon Go run on every device?
My device is not supported by ARCore, but Pokemon Go runs on it with full performance.
Why?
Until October 2017, Pokemon Go appears to have used a Niantic-made AR engine. At a high level, the game placed the Pokemon globally in space at a server-defined location (the spawn point). The AR engine used the phone's GPS and compass to determine whether the phone should be turned to the left or to the right. Once the phone pointed at the right heading, the AR engine drew the 3D model over the video coming from the camera. At that time there was no attempt to map the environment, recognize surfaces, etc. It was a simple yet very effective technique which created the stunning effects we've all seen.
After that, Niantic showed prototypes of Pokemon GO using ARKit for iOS. The enhancements are easy to notice: missed pokeballs appear to bounce very naturally on the sidewalk and respect physics, and Pikachu seems to walk naturally on the sidewalk rather than floating in the air as in the release that was current at the time. Most observers expected Niantic to replace the existing engine with ARKit (iOS) and ARCore (Android), possibly via the Unity 3D AR APIs.
In early 2018 Niantic improved the look of the game on Android by adding support for ARCore, Google's augmented-reality SDK, a similar update to the one we had already seen on iOS 11 with ARKit. The iOS update gave the virtual monsters a much greater sense of presence in the world, thanks to camera tracking, allowing them to stand more accurately on real-world surfaces rather than floating in the center of the frame. Android users need a phone compatible with ARCore in order to use the new "AR+" mode.
Prior to AR+, Pokémon Go used rough approximations of where objects were to try to place the Pokémon in your environment, but it was a clunky workaround that functioned mostly as a novelty feature. The new AR+ mode also lets iOS users take advantage of a new capture bonus, called expert handler, that involves sneaking up close to a Pokémon, so as not to scare it away, in order to capture it more easily. Since ARKit is designed to use the camera together with the gyroscope and all the other sensors, it feeds in 60 fps at full resolution. It's a lot more performant and it actually uses less battery than the original AR mode.
For iOS users there's a standard list of supported devices:
iPhone 6s and higher
iPad 2017 and higher
For Android users not everything is so clear. Let's see why. Even if you have an officially unsupported device with poorly calibrated sensors, you can still use ARCore on your phone; for example, the ARCore for All project lets you do it. So for Niantic, likewise, there would be no difficulty in making every Android phone suitable for Pokemon Go.
Hope this helps.