Good day.
I am using the 0.6 Google Cardboard SDK in Unity 5.3. When I publish apps to an Android device, everything works perfectly and looks crystal sharp through the Cardboard (Note 4 & S6).
However, when I start using the SDK for iOS (iPhone 5 & 6S), the images look very blurry when looking through the Cardboard (without the Cardboard, the display looks just as sharp as on Android). When I increase the physical distance between the iPhone and the lenses, the image becomes sharp again. Is there a variable in the Google Cardboard SDK that I should change for iOS?
I have been looking at this for quite some time now, but nothing seems to fix it, and I also cannot find anything on the internet about it.
Does somebody know what to do? Thanks in advance for the help!!
PS
As a test I did a clean install of the Google Cardboard demo scene. The same problem occurs in that app as well, so it is definitely not caused by my app.
PPS
In the Android API Reference the following parameter is available:
float getVerticalDistanceToLensCenter()
Is there something like that in the Unity Google Cardboard SDK?
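For reference, a rough sketch of where the corresponding viewer parameters might live in the 0.6 Unity SDK is below. The member names (Cardboard.SDK.Profile and its lens fields) are recalled from the 0.6-era scripts and should be treated as assumptions; check CardboardProfile.cs in your own project before relying on them.

    // Assumption: member names below are approximate for Cardboard Unity SDK 0.6;
    // verify them against CardboardProfile.cs in your project.
    using UnityEngine;

    public class LogViewerProfile : MonoBehaviour
    {
        void Start()
        {
            // Cardboard.SDK is the 0.6-era singleton; its profile holds the viewer
            // geometry that getVerticalDistanceToLensCenter() exposes on Android.
            var profile = Cardboard.SDK.Profile;
            Debug.Log("Screen-to-lens distance: " + profile.device.lenses.screenDistance);
            Debug.Log("Lens offset: " + profile.device.lenses.offset);
        }
    }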
Related
OK, I have no idea what is going on here and I can't find any solutions anywhere. Here is what happens when I try to run this ARKit Unity demo (or any AR demo, for that matter), https://github.com/FusedVR/PetAR, built to my iPhone:
The UI shows up, but where the camera capture is supposed to appear, I just get a blue screen. This is not what happens in their demo video online, and it seems no one else has this problem.
I am on Unity 5.6.6; I was on 2017 before and that did not work either. I made sure I had some text in my "Camera Usage Description" field so the iPhone would allow camera access, and I am out of solutions at this point.
How can I get ARKit to work in Unity deployed to iOS? What am I doing wrong here?
I am deploying the Unity build via Xcode 9, the most recent beta.
There are certain hardware and software requirements in order to run ARKit-based applications.
https://developer.apple.com/arkit/
High Performance Hardware and Rendering Optimizations
ARKit runs on the Apple A9 and A10 processors.
Practically, you need an iPhone 6s or newer.
Introducing ARKit
iOS 11 introduces ARKit, a new framework …
iOS 11 is also required.
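If your device and iOS version meet those requirements and you still see a blue screen, it can help to confirm at runtime that camera access was actually granted. A minimal Unity C# sketch using only stock Unity APIs (not the ARKit plugin itself) might look like this:

    using System.Collections;
    using UnityEngine;

    public class CameraAccessCheck : MonoBehaviour
    {
        IEnumerator Start()
        {
            // Log the device and OS so the A9-or-newer / iOS 11 requirements can be checked.
            Debug.Log("Device: " + SystemInfo.deviceModel + ", OS: " + SystemInfo.operatingSystem);

            // Request camera access; on iOS this triggers the permission prompt
            // backed by the Camera Usage Description player setting.
            yield return Application.RequestUserAuthorization(UserAuthorization.WebCam);

            if (!Application.HasUserAuthorization(UserAuthorization.WebCam))
            {
                Debug.LogWarning("Camera access denied; the AR camera background will stay blank.");
            }
        }
    }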
I'm having an issue with world space UI and Google Cardboard. The canvas is attached to the character, and I can clearly see it in the camera preview and when VR mode is disabled, but not when VR is enabled. I've googled around for a couple of hours and messed with settings in the Cardboard prefabs to no avail. I also noticed that the world space UI in the demo scene that came with Cardboard has the same problem.
Which Unity version are you using?
Have you tried setting the Canvas Render Mode to World Space and setting the Event Camera to the Main Camera?
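In code, that setup looks roughly like this (a minimal sketch; assign your own canvas in the Inspector):

    using UnityEngine;

    public class WorldSpaceCanvasSetup : MonoBehaviour
    {
        public Canvas canvas; // the world space canvas attached to your character

        void Start()
        {
            // Render the canvas in the 3D scene rather than as a screen overlay,
            // and let the main (VR) camera handle the UI events.
            canvas.renderMode = RenderMode.WorldSpace;
            canvas.worldCamera = Camera.main;
        }
    }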
If it is still not working for you, you might want to look into the issue that I bumped into.
World Space Canvas UI Element Issue
It has been a known issue for a long time, since some Unity 4.* versions; it was resolved in 5.*, but it has appeared again in 5.3.5 and some other versions.
Downgrading Unity to 5.3.4f1 helps (according to the blog post above).
But if you would rather work around it than downgrade, you might want to use a Quad GameObject and add a child 3D Text object.
I have been doing the same for my Cardboard games.
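A minimal sketch of that Quad + 3D Text workaround (the sizes and text here are just placeholder values):

    using UnityEngine;

    public class QuadLabel : MonoBehaviour
    {
        void Start()
        {
            // The Quad acts as the "panel" background in world space.
            GameObject quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
            quad.transform.SetParent(transform, false);
            quad.transform.localScale = new Vector3(1.0f, 0.3f, 1.0f);

            // The 3D Text child renders the label without using a Canvas at all.
            GameObject label = new GameObject("Label");
            label.transform.SetParent(quad.transform, false);
            label.transform.localPosition = new Vector3(0f, 0f, -0.01f); // just in front of the quad

            TextMesh text = label.AddComponent<TextMesh>();
            text.text = "Hello Cardboard";
            text.anchor = TextAnchor.MiddleCenter;
            text.characterSize = 0.05f;
            text.fontSize = 64;
        }
    }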
I am looking into VR using native Swift. I have found some interesting ways to port the Google Cardboard SDK, but I want to reach out and see if anyone has experience with this, specifically capturing panoramic images using an iPhone and converting them into VR content.
I have been searching for ways to integrate the Google Cardboard SDK on iOS. One way is using Unity, but I am looking for something that lets me integrate the Cardboard SDK directly into iOS, where I can view a panoramic image. Is there any way to do that?
I am looking for an iOS alternative to this project: Link Here
Okay, I've spent a few days getting CardboardSDK-iOS to do what I want (which is like the "Exhibit" demo in the Google Cardboard app), and I'm pretty pleased with it. I'm guessing that it's pretty faithful to the original, but since I'm not familiar with the original, I can't say for sure.
But I can say that it's not just a case of dropping a panoramic data set in. You need to do a bit of work to display the required stereo image pair, in OpenGL, depending on where the viewer has their head pointing. If you understand 3D transforms and how OpenGL works, and you've got your data prepared correctly, it should not be too onerous to get it working.
Of course, this is all done in Xcode in Objective-C/C++, not in Java. And I'm assuming that by "panoramic image" you mean you have a hemispherical stereo data set, which should give you something like what you see in Google's Cardboard "Urban Hike" demo.
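To make the "depends on where the head is pointing" part concrete, here is a small conceptual sketch, written as Unity-style C# purely for brevity (the real CardboardSDK-iOS code does the equivalent with OpenGL matrices in Objective-C++; the IPD value is an assumed typical default, not something taken from the SDK):

    // Conceptual sketch only: derive one view matrix per eye from the head pose.
    using UnityEngine;

    public static class StereoViewSketch
    {
        public static void EyeViewMatrices(
            Vector3 headPos, Quaternion headRot, float ipdMeters,
            out Matrix4x4 leftView, out Matrix4x4 rightView)
        {
            // Offset each eye by half the interpupillary distance along the
            // head's local x-axis, then invert the eye pose to get a view matrix.
            Vector3 right = headRot * Vector3.right;
            Vector3 leftEye  = headPos - right * (ipdMeters * 0.5f);
            Vector3 rightEye = headPos + right * (ipdMeters * 0.5f);

            leftView  = Matrix4x4.TRS(leftEye,  headRot, Vector3.one).inverse;
            rightView = Matrix4x4.TRS(rightEye, headRot, Vector3.one).inverse;
        }
    }

Render the scene (or sample your hemispherical data set) once per eye with the corresponding view matrix, updating the head rotation from the tracker every frame.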
Hope this helps!
I have integrated the Google Maps SDK into an iOS application and I would like to display 3D satellite maps. According to the documentation, this should just work. I can tilt the view, but the displayed map remains flat (i.e. mountains do not show up in 3D as they do in Google Earth).
I have been searching extensively for this, but found no reference to whether it actually works or not. Does anybody know whether 3D maps (Google SDK) work on iOS and I am just hitting some limitation or wrong switch, or whether they do not work at all?
As of SDK v1.8, tilted layers do appear to have some 3D elevation effects, but they are more subtle than what Google Earth typically shows.