Display a 3D object via VR glasses with user interaction [closed] - iOS

I have a 3D object displayed on the iPhone screen. I need to display it via VR glasses so that the user can interact with the object (zoom in/out).
I need help with how to do this using VR glasses without using Unity.
Thanks

If you don't want to use Unity, then you will have to roll this mostly on your own. There are some example projects that show how to create basic VR experiences on iOS (including one I made using SceneKit).
If you want to make something more serious, and are not just looking to experiment, then I'd highly recommend using Unity, since you can then use the latest Google Cardboard SDK. That will give you much better results since it handles all of the camera/view aspects for you.
You can use a Google Cardboard as your headset; at the very least it is an inexpensive initial option. There is a magnetic "trigger" on the Cardboard that you can use for binary input in your app. My example project also includes a class that handles input from this trigger by detecting the disruption it causes to the magnetic field around the device's compass. You can use this trigger to handle the zoom in/out, roughly as sketched below.
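Here is a minimal sketch of that idea, assuming a SceneKit scene with a dedicated camera node; the class name, field-strength threshold, and zoom distances are made-up values you would need to tune:

```swift
import CoreMotion
import SceneKit

/// Hypothetical sketch: treat a sudden spike in the magnetometer reading as a
/// Cardboard trigger pull, then move a SceneKit camera to zoom in/out.
final class MagneticTriggerZoom {
    private let motionManager = CMMotionManager()
    private var baseline: Double?
    private let cameraNode: SCNNode   // assumed to hold the scene's SCNCamera
    private var zoomedIn = false

    init(cameraNode: SCNNode) {
        self.cameraNode = cameraNode
    }

    func start() {
        guard motionManager.isMagnetometerAvailable else { return }
        motionManager.magnetometerUpdateInterval = 1.0 / 30.0
        motionManager.startMagnetometerUpdates(to: .main) { [weak self] data, _ in
            guard let self = self, let field = data?.magneticField else { return }
            let magnitude = sqrt(field.x * field.x + field.y * field.y + field.z * field.z)
            if self.baseline == nil { self.baseline = magnitude }
            guard let baseline = self.baseline else { return }
            // A large deviation from the resting field is treated as a trigger pull.
            if abs(magnitude - baseline) > 100 {   // threshold in microtesla, assumed
                self.toggleZoom()
                self.baseline = nil                // re-calibrate after the pull
            }
        }
    }

    private func toggleZoom() {
        zoomedIn.toggle()
        // Move the camera closer to or farther from the object to fake a zoom.
        let action = SCNAction.move(to: SCNVector3(0, 0, zoomedIn ? 2 : 6), duration: 0.3)
        cameraNode.runAction(action)
    }

    func stop() {
        motionManager.stopMagnetometerUpdates()
    }
}
```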
Good luck!

Related

Embed Augmented Reality into a website [closed]

I was just wondering if anyone knows whether it is possible to embed augmented reality into a website. For instance, if I were to create a Space Invaders-style game, could I put a placeholder image onto a website that comes to life and starts playing the game outside/around the computer when the user holds their phone up to the screen?
OR
Create a website designed to be accessed on a mobile device. Then, when the user accesses the website, it accesses (with their permission) the user's camera to turn their environment into the Space Invaders game. Critically, I don't want to force the user to download anything, but I can't find anything to confirm whether this is possible or not.
Many thanks!
You can use WebGL + Vuforia to build augmented reality for a web page, for example like this.
You can also check this related Medium post; it may be helpful. That project was developed with a JavaScript framework for augmented reality.

What is the best augmented reality environment for developing a mobile app? [closed]

I need to develop a prototype of an augmented reality app for research purposes.
This is my first time with an augmented reality application, and I only have basic knowledge of Android.
The application is an Android application that uses video overlay to display over the scene after detecting a target. The only problem is that the videos are obtained from another user (through another application that will allow the user to record a video). The tutorials and examples I found attach the video to the target at development time, by the developer rather than the user, and that's not my case.
Since either way I have to invest some time to learn how to develop it, what's your recommended environment? Android Studio with Vuforia? Vuforia with Unity? Or another SDK? I would also appreciate it if you have a similar tutorial and samples.
It's just a prototype, so I'm not looking for high quality; easy and less time-consuming is what I'm after.
Unity 3D with Vuforia is easier for a beginner to understand. For more advanced functionality you might want to use Android Studio. I am not sure how the user can attach a video to the target, since normally the developer has to include the video file in the Assets folder to attach it to the target; this is how it is done in Unity. I hope someone can give you better insight into this. But I have worked with Unity/Vuforia and it is a pretty comfortable and easy environment for a beginner. All the best for your prototype :)

Can we make an app that uses the camera and makes a call at the same time? [closed]

I know I can't record a video while I'm on a phone call in general, and that if the capability existed or could exist, it would be a feature/option.
Are there apps that do this, and if it can be done, can someone help me get started code-wise?
EDIT
Because there seems to be some confusion as to what I'm asking, I will clarify:
Are we able to launch phone calls using the iPhone's regular calling feature and use the iPhone's regular video camera at the same time in an app? As we know, it can't be done generally.
Yes and no. You can't use Apple's built-in camera and phone features together to do it. But obviously there are many apps (e.g. Skype and others) that use both the camera and voice in their VoIP implementations. So it would take a custom camera and a VoIP implementation to accomplish this.
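To make that VoIP-style approach concrete, here is a rough sketch, under my own assumptions: a video-only AVCaptureSession combined with an AVAudioSession configured for voice chat. The actual call audio and transport would come from your VoIP stack (for example WebRTC), which is not shown here.

```swift
import AVFoundation

/// Sketch of a "custom camera + VoIP" setup: the capture session handles only
/// video, while the audio session is configured for two-way voice.
final class CallCameraController {
    let captureSession = AVCaptureSession()

    func configure() throws {
        // Audio session suitable for two-way voice chat.
        let audioSession = AVAudioSession.sharedInstance()
        try audioSession.setCategory(.playAndRecord,
                                     mode: .voiceChat,
                                     options: [.allowBluetooth, .defaultToSpeaker])
        try audioSession.setActive(true)

        // Video-only capture; audio input is left to the VoIP engine.
        captureSession.beginConfiguration()
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              captureSession.canAddInput(input) else {
            captureSession.commitConfiguration()
            return
        }
        captureSession.addInput(input)

        let output = AVCaptureVideoDataOutput()
        if captureSession.canAddOutput(output) {
            captureSession.addOutput(output)   // feed frames to the video-call pipeline
        }
        captureSession.commitConfiguration()
    }

    func start() { captureSession.startRunning() }
    func stop()  { captureSession.stopRunning() }
}
```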

Augmented reality to measure - iPhone [closed]

This is my first time working on an augmented reality topic. I am about to develop an application which uses augmented reality to recognise and measure any number of objects in my room, something like in the attached image. I need to identify the edges and corners of each object and do some heavy math for the measurement. I am pretty sure this can't be achieved through the iOS SDK alone; I need to use some external library/SDK. I need a scanner SDK that does real-time image recognition.
I came across Qualcomm's Vuforia, RealityCap, and metaio. My dilemma is: can a developer who has worked on product- and business-oriented iOS applications do this image recognition work? A typical iOS developer does not have deep experience in image processing. Can anyone suggest some approaches to get past this? Ideas are welcome too; they would help me a lot.
You can use the metaio SDK to scan different objects. You can create as many object models as you need in Unity3D and keep them in a database. This SDK helps you do 3D scanning and offers many more features.
I think there is also a possibility to use OpenCV.
The API can be found at: http://opencv.org
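As an alternative to OpenCV, and only as a hypothetical sketch: Apple's Vision framework (added in iOS 11, so newer than this question) can detect rectangular objects and report their corner points, which covers the "edges and corners" part. Turning those corners into real-world measurements still needs extra work (a known reference object, depth data, etc.). The confidence threshold below is an assumed value.

```swift
import Foundation
import CoreGraphics
import Vision

/// Find rectangular objects in an image and hand back their corner points.
func detectRectangles(in cgImage: CGImage,
                      completion: @escaping ([VNRectangleObservation]) -> Void) {
    let request = VNDetectRectanglesRequest { request, _ in
        completion(request.results as? [VNRectangleObservation] ?? [])
    }
    request.maximumObservations = 10      // how many objects to report
    request.minimumConfidence = 0.6       // assumed threshold, tune as needed

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}

// Each observation exposes topLeft/topRight/bottomLeft/bottomRight in
// normalized image coordinates (0...1), which you can scale to pixels.
```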

RubyMotion for game development [closed]

How suitable is RubyMotion for iOS game development?
I was not able to find Core Animation topics in the documentation, but I've heard someone was able to use Cocos2D with RubyMotion.
If you have some useful information which you are willing to share, I would greatly appreciate this!
If you are willing to build your game entirely in code, RubyMotion is a fine choice. There is no appreciable drop in performance, and every C library and API is available to RubyMotion. Using motion-cocoapods you are even able to include CocoaPods, and you can also include Objective-C libraries.
The one issue you may run into is a lack of RubyMotion-specific tutorials and documentation for games.
EDIT: I successfully (with help) recreated the Sparrow game engine demo in RubyMotion:
https://github.com/jamonholmgren/demo-sparrow
It runs beautifully.
You can try Joybox: it combines Cocos2D, the popular 2D game engine for iOS, with the Box2D physics engine, and wraps them in a Ruby API. Joybox can be found on GitHub.
