Where can I find support for making ARCore-compatible hardware? - arcore

We make AR headsets. I wanted to know where we can get support on which sensor modules to use to make our hardware ARCore compatible.

Related

Metal 2 API features on older devices

According to the documentation (https://developer.apple.com/documentation/metal/gpu_features/understanding_gpu_family_4), "On A7- through A10-based devices, Metal doesn't explicitly describe this tile-based architecture". In the same article I saw "Metal 2 on the A11 GPU" and got confused, because I couldn't find any more information about Metal 2 support in the Metal Shading Language specification. For example, I found the table "Attributes for fragment function tile input arguments" and the note "iOS: attributes in Table 5.5 are supported since Metal 2.0."
Is Metal 2 support specific to a GPU family?
Not all features are supported by all devices: newer devices generally support more features, while older devices might not support newer ones. Several factors determine this support.
First, each MTLDevice has a set of MTLGPUFamily values it supports, which you can query with the supportsFamily method. Some documentation articles mention which family a device needs to support to use this or that feature, but generally you can find that information in the Metal Feature Set Tables. Family support varies with the chip itself: how much memory and which hardware units are available to it. Chips are grouped into families based on those characteristics.
Second, there are other supports* queries on MTLDevice that don't depend on the family, but rather on the device itself, like the supportsRaytracing query. These are also based on the GPU, but are kept separate, probably because they don't fall neatly into any of the "families".
The third kind of support is based on the OS version. Newer OS versions may ship new APIs or extensions to existing APIs. Those are marked with API_AVAILABLE macros in the headers and may only be used on OSes of the same version or higher. To query support for these, you need the @available check in Objective-C or the #available syntax in Swift. Here, API availability isn't affected so much by the GPU itself as by having a newer OS and the drivers that ship with it.
The last kind of "support" that limits some features is the Metal Shading Language version. It's tied to the OS version, and it's what those notes in the Metal Shading Language specification you mentioned refer to. Here, feature availability is a mix of compiler-version limitations (not everyone uses the latest spec; I think most production game engines target something like Metal 2.1, at least the games not on the latest engine versions) and device limitations. For example, tile shaders require both a minimum compiler version and an Apple Silicon GPU.
So there are several distinct types of support at play when you use Metal in your application. They're easy to confuse, but it's important to understand each one.
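The four kinds of checks described above can be sketched in Swift. This is illustrative only: the specific family, feature, OS versions, and MSL version queried here are arbitrary examples, not requirements from the question.

```swift
import Metal

guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal device available")
}

// 1. GPU-family support (see the Metal Feature Set Tables).
let isApple7 = device.supportsFamily(.apple7)

// 2. Per-device capability queries that sit outside the families.
if #available(iOS 14.0, macOS 11.0, *) {
    let hasRaytracing = device.supportsRaytracing
    print("Raytracing:", hasRaytracing)
}

// 3. OS-version gating for newer APIs (API_AVAILABLE in the headers).
if #available(iOS 16.0, macOS 13.0, *) {
    // APIs introduced in these OS versions are safe to call here.
}

// 4. Metal Shading Language version, chosen when compiling shader source.
let options = MTLCompileOptions()
options.languageVersion = .version2_1  // pin the MSL version the compiler targets
```

Note that the per-device queries in step 2 are themselves gated by OS version, which is why the raytracing check sits inside an availability block.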

ARCore on a 96Boards HiKey board

I want to run ARCore on a HiKey board; I'm currently running Android 9.0 AOSP on it. Is it possible to run an AR application on the HiKey board with an external USB camera? Do I need a specific camera, or is there anything else I need in order to run AR applications on it? Does the HiKey even support ARCore? If you could answer this, it would really help me.
Thank you.
ARCore has minimum system requirements, including particular sensors and cameras.
I don't believe an official set of minimum requirements has been published openly at this point, but there is a list of supported devices: https://developers.google.com/ar/discover/supported-devices
Google actually tests and certifies these devices, so I don't think you will find 'official' support for your setup. They say:
To certify each device, we check the quality of the camera, motion sensors, and the design architecture to ensure it performs as expected. Also, the device needs to have a powerful enough CPU that integrates with the hardware design to ensure good performance and effective real-time calculations.

Why does ARKit require iOS 11 and an A9 when other SDKs like Vuforia or Wikitude can do the same with lower iOS versions?

I'm asking myself why ARKit and ARCore require higher OS versions than other SDKs like Vuforia or Wikitude. It seems like those can do what ARKit and ARCore can do. Or are there some limitations when it comes to tracking?
Thanks in advance!
ARKit is a framework bundled with iOS, and (my guess is) it relies on other parts of the OS in order to function. All native frameworks are bundled and distributed through iOS updates, in contrast to frameworks like Vuforia and Wikitude, which are completely separate and don't rely on internal iOS frameworks.
On top of that, frameworks like ARKit and ARCore do a lot more than the other two examples, utilizing the device's hardware to ensure better accuracy and deliver an overall better experience.
These techniques are computationally demanding, so it makes sense for the A9 chip to be a requirement in order to carry out the calculations needed for this accuracy, and that's on top of the graphical requirements of the ARKit framework.
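That hardware gate is also visible at runtime: ARKit's tracking configurations report themselves as unsupported on pre-A9 devices. A minimal Swift sketch (requires iOS 11+; the fallback branch is a hypothetical placeholder):

```swift
import ARKit

// ARWorldTrackingConfiguration.isSupported is false on pre-A9 devices,
// which is how the A9 requirement surfaces to application code.
if ARWorldTrackingConfiguration.isSupported {
    let session = ARSession()
    session.run(ARWorldTrackingConfiguration())
} else {
    // Fall back to a non-AR experience on older hardware.
    print("World tracking not supported on this device")
}
```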

General GPU programming on iPhone [duplicate]

With the push toward multimedia-enabled mobile devices, this seems like a logical way to boost performance on these platforms while keeping general-purpose software power-efficient. I've been interested in the iPad hardware as a development platform for UI and data display/entry usage, but I'm curious how much processing capability the device itself has. OpenCL would make it a juicy hardware platform to develop on, even though the licensing seems like it kinda stinks.
OpenCL is not yet part of iOS.
However, the newer iPhones, iPod touches, and the iPad all have GPUs that support OpenGL ES 2.0, which lets you create your own programmable shaders to run on the GPU and perform high-performance parallel calculations. While not as elegant as OpenCL, it might let you solve many of the same problems.
Additionally, iOS 4.0 brought the Accelerate framework, which gives you access to many common vector-based operations for high-performance computing on the CPU. See Session 202 - The Accelerate framework for iPhone OS in the WWDC 2010 videos for more on this.
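As a small taste of what Accelerate offers, here is a sketch using the modern Swift vDSP overlay (iOS 13+; the underlying vDSP routines are the same family of CPU vector operations the answer refers to):

```swift
import Accelerate

// Element-wise vector addition on the CPU's SIMD units via vDSP.
let a: [Double] = [1, 2, 3, 4]
let b: [Double] = [10, 20, 30, 40]
let sum = vDSP.add(a, b)      // [11.0, 22.0, 33.0, 44.0]

// Dot product, another common Accelerate primitive.
let dot = vDSP.dot(a, b)      // 1*10 + 2*20 + 3*30 + 4*40 = 300
```

For large arrays these calls are typically much faster than a hand-written loop, since they dispatch to vectorized, hardware-tuned implementations.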
Caution! This question ranks as the 2nd result on Google; however, most answers here (including mine) are out of date. People interested in OpenCL on iOS should visit more up-to-date entries like this one: https://stackoverflow.com/a/18847804/443016.
http://www.macrumors.com/2011/01/14/ios-4-3-beta-hints-at-opencl-capable-sgx543-gpu-in-future-devices/
The iPad 2's GPU, the PowerVR SGX543, is capable of OpenCL.
Let's wait and see which iOS release brings OpenCL APIs to us. :)
Following from nacho4d:
There is indeed an OpenCL.framework in iOS 5's private frameworks directory, so I would suppose iOS 6 is the one to watch for OpenCL.
Actually, I've seen it in OpenGL-related crash logs for my iPad 1, although that could just be the CPU (implementing parts of the graphics stack, perhaps, like on OS X).
You can compile and run OpenCL code on iOS using the private OpenCL framework, but you probably won't get the app into the App Store (Apple doesn't want you using private frameworks).
Here is how to do it:
https://github.com/linusyang/opencl-test-ios
OpenCL? Not yet.
A good way of guessing the next public frameworks in iOS is by looking at the private frameworks directory.
If you see there what you are looking for, then there are chances.
If not, then wait for the next release and look again in the Private stuff.
I guess Core Image is coming first, because OpenCL is too low-level ;)
Anyway, this is just a guess.

Is it possible to develop for the Kinect sensor without having an Xbox 360?

Is it possible to develop for the Kinect sensor without having an Xbox 360?
We would like to use the Kinect to develop an augmented reality application, but we're not sure if we need to get an Xbox for this. Do we have to, or can we develop using other platforms?
Yes, there are other APIs for interacting with the Kinect.
Microsoft has released its beta API:
http://research.microsoft.com/en-us/um/redmond/projects/kinectsdk/
The caveat with Microsoft's API is that it cannot be licensed for commercial use. It's also a beta, so functionality isn't locked down, and there may be bugs.
OpenKinect is an open-source alternative, but it requires a little more work to get up and running:
http://openkinect.org/wiki/Main_Page
Yes! It is very much possible to develop applications and devices with the Kinect without an Xbox. Particular examples are in robotics (http://turtlebot.com/) and 3D printing (http://www.makerbot.com/blog/2011/05/26/3d-printing-with-kinect/). You should be able to get your project done using the OpenNI drivers (http://www.openni.org/).
