iOS Augmented Reality Framework and some guidance

I have done some iOS development before, but never anything that handled the camera (which is what I think I need here). Could you point me in the right direction?
What I was asked to do is basically build a QR code reader that can display certain information (images/video/text), and then take it up a notch by adding augmented reality (not necessarily using those QR codes, but within the same app).
I've looked for augmented reality frameworks for iOS and found SimpleGeo's SGAREnvironment (https://github.com/simplegeo/SGAREnvironment), but it's more of a location-based AR framework. Are there any others? Or should I drop the QR reader entirely and do everything with the AR framework?

A few hints:
OpenCV
OpenCV iOS port
ZXing - barcode reading/generation library (supports QR codes)
LibDecodeQR (uses OpenCV)

Related

A-Frame: FOSS Options for widely supported, markerless AR?

A-Frame's immersive-ar functionality will work on some Android devices I've tested with, but I haven't had success with iOS.
It is possible to use an A-Frame scene for markerless AR on iOS using a commercial external library; for example, this demo from Zapworks using their A-Frame SDK: https://zappar-xr.github.io/aframe-example-instant-tracking-3d-model/
The tracking seems to be nowhere near as good as A-Frame's hit-test demo (https://github.com/stspanho/aframe-hit-test), but it does seem to work on virtually any device and browser I've tried, and it is good enough for the intended purpose.
I would be more than happy to fall back to a lower-quality AR mode in order to have AR at all on devices that don't support immersive-ar in the browser. I have not been able to find an A-Frame-compatible solution that uses only free/open-source components for doing this, only commercial products like Zapworks and 8th Wall.
Is there a free / open source plugin for A-Frame that allows a scene to be rendered with markerless AR across a very broad range of devices, similar to Zapworks?
I ended up rolling my own solution, which wasn't complete but was good enough for the project. Strictly speaking, there are three problems to overcome to get a markerless AR experience on mobile without relying on WebXR:
Webcam display
Orientation
Position
Webcam display is fairly trivial to implement in HTML5 without any libraries.
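As a rough sketch of that webcam backdrop (plain HTML5, no libraries): the element id `camera` below is an assumption for illustration, not something from the post, and the `getUserMedia` wiring is guarded so the pure helper can also be exercised outside a browser.

```javascript
// Pure helper: media constraints requesting the rear ("environment") camera,
// video only. Kept separate so it can be tested without a browser.
function cameraConstraints() {
  return { video: { facingMode: 'environment' }, audio: false };
}

// Browser-only wiring: stream the camera into a full-screen <video id="camera">
// element sitting behind the scene. Guarded so this file loads outside a browser.
if (typeof navigator !== 'undefined' && navigator.mediaDevices) {
  navigator.mediaDevices.getUserMedia(cameraConstraints())
    .then((stream) => {
      const video = document.getElementById('camera');
      video.srcObject = stream;
      // iOS Safari refuses to play inline video without this attribute.
      video.setAttribute('playsinline', '');
      return video.play();
    })
    .catch((err) => console.error('Camera access failed:', err));
}
```

Note that iOS Safari only exposes `getUserMedia` on secure (https) origins, so the page has to be served over TLS.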
Orientation is already handled nicely by A-Frame's "magic window" functionality, including on iOS.
Position was tricky, and I wasn't able to solve it. I attempted to use the FULLTILT library's accelerometer functions, and even with gravity filtered out of the readings I wasn't able to get a high enough level of accuracy. (As it happened, this particular project didn't really need it.)

Developing a custom filter for iOS Augmented Reality App

Has anyone worked on developing a custom filter for augmented reality in iOS apps in Swift? I want to create a very specific filter look for an iOS app that blends AR on top of the user's existing surroundings.
e.g. a Winter Wonderland theme (snow falling, snow on the ground and on the buildings around the user)
What's the best way to approach this?
To do this you generally need a computer vision technique called SLAM (Simultaneous Localization and Mapping). There are multiple SDKs that offer this for iOS in Swift, such as KudanCV (https://www.kudan.eu/download-kudan-cv-sdk/) and ARToolKit (https://artoolkit.org/download-artoolkit-sdk).
However, if you want to develop your own SLAM algorithm, I'd recommend looking into LSD-SLAM (link in comment) or ORB-SLAM (link in comment).
There's also an iOS port of ORB-SLAM (link in comment).
I hope that helped.

Codename One augmented reality

I'd like to port a small Android app that uses the camera and draws some info on the screen while pointing at a specific location. My intention is to port it to iOS. I've found that Codename One could be suitable, but I don't know if it supports use of the camera. Could anyone help me? Thanks.
Codename One still doesn't support some of the more elaborate AR APIs introduced by Google/Apple, but we do support placing a camera viewfinder into your app with a new cn1lib: https://github.com/codenameone/CameraKitCodenameOne
Original answer below for reference:
It supports use of the camera, but it currently doesn't support AR use cases where you draw on the camera viewfinder. You can do that with native code, but since I understand this is the main feature of the app, that's probably not the ideal solution.
If it's just a small feature within a larger app, then native interfaces make sense.

Augmented Reality, Move 3d model respective to device movement

I am working on an augmented reality app. I have augmented a 3D model using OpenGL ES 2.0. Now, my problem is that when I move the device, the 3D model should move according to the device's movement, just like this app does: https://itunes.apple.com/us/app/augment/id506463171?l=en&ls=1&mt=8. I have used UIAccelerometer to try to achieve this, but I haven't been able to.
Should I use UIAccelerometer to achieve it, or some other framework?
This requires a complicated algorithm, not just the accelerometer. You'd be better off using a third-party framework such as Vuforia or Metaio; that would save a lot of time.
Download and check a few sample apps. That is exactly what you want.
https://developer.vuforia.com/resources/sample-apps
You could use Unity3D to load your 3D model and export an Xcode project, or you could use OpenGL ES.
From your comment, am I to understand that you want to have the model anchored at a real-world location? If so, then the easiest way to do it is by giving your model a GPS location and reading the device's GPS location. There is actually a lot of research going into positional tracking, but for now GPS is your best (and likely only) option without going into advanced positional tracking solutions.
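Turning the two GPS fixes into a local offset for the renderer boils down to a small bit of spherical geometry. A minimal sketch (JavaScript for illustration; the equirectangular approximation assumed here is fine for the short distances AR anchoring cares about, and the function name is mine, not from any SDK):

```javascript
// Mean Earth radius in metres.
const EARTH_RADIUS_M = 6371000;

// Convert a GPS anchor (lat/lon in degrees) into east/north offsets in metres
// relative to the device's own GPS fix, using an equirectangular approximation.
function gpsToLocalMetres(device, anchor) {
  const toRad = (deg) => deg * Math.PI / 180;
  const dLat = toRad(anchor.lat - device.lat);
  const dLon = toRad(anchor.lon - device.lon);
  const meanLat = toRad((anchor.lat + device.lat) / 2);
  return {
    east: dLon * Math.cos(meanLat) * EARTH_RADIUS_M, // longitude lines converge with latitude
    north: dLat * EARTH_RADIUS_M,
  };
}
```

The resulting east/north offsets can then be fed into the model's translation, with the compass heading used to rotate them into camera space. Keep in mind that consumer GPS is only accurate to a few metres at best, so the anchor will visibly wander.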
Since I can't add comments due to my account being too new, I'll also add a warning not to try to position the device using accelerometer data. You'll get far too much error due to the double integration of acceleration to get position (see "Indoor Positioning System based on Gyroscope and Accelerometer").
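The double-integration problem that warning refers to is easy to see numerically: a constant bias b in the acceleration reading becomes a position error of roughly 0.5·b·t². A tiny sketch (JavaScript; the 0.05 m/s² bias below is an assumed, plausible consumer-IMU figure, not a measured one):

```javascript
// Double-integrate a constant accelerometer bias over `steps` samples of
// duration `dt` seconds, returning the accumulated position error in metres.
function integrateBias(bias, dt, steps) {
  let v = 0; // velocity error, m/s
  let x = 0; // position error, m
  for (let i = 0; i < steps; i++) {
    v += bias * dt; // first integration: velocity error grows linearly
    x += v * dt;    // second integration: position error grows quadratically
  }
  return x;
}
```

With a 0.05 m/s² bias sampled at 100 Hz, the position error is a couple of centimetres after one second but roughly a hundred times larger after ten, which is why raw accelerometer positioning falls apart so quickly.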
I would definitely use Vuforia for this task.
Regarding your comment:
I am using Vuforia framework to augment 3d model in native iOS. It's okay. But, I want to
move 3d model when I move device. It is not provided in any sample code.
Well, it's not provided in any sample code, but that doesn't necessarily mean it's impossible or too difficult.
I would do it like this (I'm working on Android in C++, but it should be very similar on iOS):
locate your renderFrame function
simply apply your translation before the actual glDrawElements call:
QCARUtils::translatePoseMatrix(xMOV, yMOV, zMOV, &modelViewProjectionScaled.data[0]);
Here the movement data would be prepared by a function that reads time and acceleration from the accelerometer...
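That translate call post-multiplies the pose matrix by a translation. As a rough sketch of what such a helper does (JavaScript for illustration, column-major 4x4 layout as OpenGL uses; this is not the actual Vuforia implementation):

```javascript
// Post-multiply a column-major 4x4 matrix by a translation: M = M * T(x, y, z).
// Only the translation column (elements 12..14, plus w at 15) changes, and the
// offset is expressed in the matrix's own local axes.
function translatePoseMatrix(x, y, z, m) {
  // New translation column = M * [x, y, z, 1]^T
  for (let row = 0; row < 4; row++) {
    m[12 + row] += x * m[row] + y * m[4 + row] + z * m[8 + row];
  }
  return m;
}
```

Because the offset is applied in the matrix's local frame, the model slides along its own axes rather than the world's, which is usually what you want when nudging a tracked model.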
What I actually find challenging is finding the right calibration to properly adjust the output from the sensor API, which is a completely different, AR/Vuforia-unrelated question. Here I'd guess you have a huge advantage over Android devs, given the much smaller variety of devices...

Open source augmented reality framework for BlackBerry

Anyone know any open source framework for augmented reality in BlackBerry or a good tutorials for creating an augmented reality application from scratch?
Here is an interface prototype for the free LayarPlayer for third-party BlackBerry 7 apps: https://gist.github.com/1219438. Not sure if Wikitude will have a lib or not.
If you want to roll your own AR lib (not recommended unless you have tons of time and energy), OpenGL ES is platform-independent; just use ComponentCanvas to overlay it on top of the camera view.
BlackBerry OS 7 SDK apparently includes APIs to assist in developing augmented reality applications.
I am working on an OpenGL application for BlackBerry, and I too have realised there are not many OpenGL tutorials for it. But you can always use Android ones; they are not really very different.
And I think we should take advantage of the new BlackBerry graphics card and CPU to create some exciting 3D applications for the platform.
You can find OpenGL basic samples on the BlackBerry website and in the BlackBerry SDK.
Note: all BlackBerry devices that run OS 7 have a dedicated graphics card and a 1.2 GHz CPU.
