PhoneGap iOS plugin - ios

I need a PhoneGap plugin that would make it possible to view video with GPS coordinates and icons layered on top of it, sort of like AR, so that as you turn and move around you pass a GPS coordinate once you're looking in its direction.
Is this possible?

If I understand the issue correctly, this is classic AR with an embedded video between the reality background and the markers. As far as I know there is no PhoneGap plugin at the moment that covers your specific needs; you can see a list of existing PhoneGap plugins here: github phonegap plugins. However, you can write your own plugin for this purpose. I would take a look at http://www.layar.com/ and consider embedding it as a project library. Also check whether the pricing fits your project.
Another approach could be to get rid of the plugin and process the GPS and accelerometer information on your own, but the data processing is considerably more complex.
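If you do go the do-it-yourself route, here is a minimal sketch of the core calculation, assuming the native side is written in Swift: compute the bearing from the user's position to the target coordinate and compare it with the compass heading to decide whether the target is currently in front of the viewer. The function names and the 30° field of view are illustrative, not part of any PhoneGap API.

```swift
import CoreLocation

/// Bearing in degrees (0..<360, clockwise from north) from `from` to `to`.
func bearing(from: CLLocationCoordinate2D, to: CLLocationCoordinate2D) -> Double {
    let lat1 = from.latitude * .pi / 180
    let lat2 = to.latitude * .pi / 180
    let dLon = (to.longitude - from.longitude) * .pi / 180
    let y = sin(dLon) * cos(lat2)
    let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
    let degrees = atan2(y, x) * 180 / .pi
    return (degrees + 360).truncatingRemainder(dividingBy: 360)
}

/// True when the target lies within `fieldOfView` degrees of where the device
/// is currently pointing (`heading` as delivered by CLLocationManager).
func isTargetVisible(user: CLLocationCoordinate2D,
                     target: CLLocationCoordinate2D,
                     heading: CLLocationDirection,
                     fieldOfView: Double = 30) -> Bool {
    let targetBearing = bearing(from: user, to: target)
    var delta = abs(targetBearing - heading)
    if delta > 180 { delta = 360 - delta }   // handle wrap-around at north
    return delta <= fieldOfView / 2
}
```

You would feed `user` and `heading` from CLLocationManager's location and heading callbacks, and push the result (plus a screen offset derived from the angle difference) back to the webview through your plugin's callback.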

Related

Creating a 360 photo experience on iOS mobile device

I am interested in VR and trying to get a bit more information. I want to create a similar experience on iOS where I can take a 360 image and view it on an iOS device by tilting the phone around, using the device's gyroscope; as I tilt the phone it pans around the 360 image (like on Google Street View where you can use the tilt gesture).
And something similar to this app: http://bubb.li/
Can anybody give a brief overview of how this would be doable, and any sources that could help me achieve this, APIs, etc.?
Much appreciated.
Two options here: You can use a dedicated device to capture the image for you, or you can write some code to stitch together multiple images taken from the iOS device as you move it around a standing point.
I've used the Ricoh Theta for this (no affiliation). They have a 360 viewer in the SDK for mapping 360 images to a sphere that works exactly as you've asked.
Assuming you've figured out how to create 360 photospheres, you can use Unity, Unreal, and probably other development platforms to create navigation between the locations you captured.
Here is a tutorial that looks pretty detailed for doing this in Unity:
https://tutorialsforvr.com/creating-virtual-tour-app-in-vr-using-unity/
One pro of doing this in something like Unity or Unreal is that once you have navigation between multiple photo spheres working, it's fairly easy to add animation or other interactive elements. I've seen interactive stories done with 360 video using this method.
(I see that the question is from a fairly long time ago, but it was one of the top results when I looked for this same topic)
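As a rough illustration of the native-iOS route (not covered by the answer above), an equirectangular 360 photo can be mapped onto a sphere with SceneKit and panned with the gyroscope via CoreMotion. The asset name `pano.jpg` is an assumption, and the attitude-to-camera mapping may need an axis adjustment for your interface orientation:

```swift
import UIKit
import SceneKit
import CoreMotion

final class PanoViewController: UIViewController {
    private let motion = CMMotionManager()
    private let cameraNode = SCNNode()

    override func viewDidLoad() {
        super.viewDidLoad()

        let scnView = SCNView(frame: view.bounds)
        view.addSubview(scnView)
        let scene = SCNScene()
        scnView.scene = scene

        // Sphere textured with the equirectangular panorama, viewed from inside.
        // (The image may appear mirrored from the inside and need flipping.)
        let sphere = SCNSphere(radius: 10)
        sphere.firstMaterial?.diffuse.contents = UIImage(named: "pano.jpg") // assumed asset
        sphere.firstMaterial?.isDoubleSided = true
        scene.rootNode.addChildNode(SCNNode(geometry: sphere))

        // Camera sits at the centre of the sphere.
        cameraNode.camera = SCNCamera()
        cameraNode.position = SCNVector3Zero
        scene.rootNode.addChildNode(cameraNode)

        // Drive the camera from device motion (gyroscope + accelerometer fusion).
        motion.deviceMotionUpdateInterval = 1.0 / 60.0
        motion.startDeviceMotionUpdates(using: .xArbitraryZVertical, to: .main) { [weak self] data, _ in
            guard let q = data?.attitude.quaternion else { return }
            // The exact axis remapping from CoreMotion to SceneKit may need
            // tweaking depending on how the device is held.
            self?.cameraNode.orientation = SCNQuaternion(x: Float(q.x), y: Float(q.y),
                                                         z: Float(q.z), w: Float(q.w))
        }
    }
}
```

This gives the tilt-to-pan behaviour described in the question; capturing or stitching the panorama itself is a separate problem, as the answer notes.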

Is it possible to recognise light patterns on iOS?

Is it possible to recognise light patterns on iOS?
Is there a native iOS SDK to do so?
Use case:
Detect light patterns (e.g. on / off) using smartphone camera
Background information:
Apple acquired Metaio last year, so I presume at some point we will have such an SDK, but for now I presume the best way to achieve this is by using a third-party SDK, or by capturing images and processing them (if the images are simple enough that a simple algorithm can be applied).
You could take a look at Kudan AR. https://www.kudan.eu/
They currently offer an SDK for iOS, not yet for Android. Their tracking quality is phenomenally good, but I do not know whether it is appropriate for your goals. It would be best to talk to them and ask if their tracking would fit your needs.
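If you instead try the "capture and process the images yourself" idea from the question, detecting a simple on/off light pattern can be as basic as averaging the brightness of each camera frame and thresholding it. A rough AVFoundation sketch, assuming a planar YUV pixel format; the threshold is a placeholder and real scenes will need calibration and smoothing:

```swift
import AVFoundation

final class LightLevelDetector: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    var onSample: ((Bool) -> Void)?          // true = "light on" for this frame
    private let threshold: Double = 0.5      // placeholder; calibrate for your scene

    func start() throws {
        // Requires camera permission (NSCameraUsageDescription in Info.plist).
        guard let device = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: device))

        let output = AVCaptureVideoDataOutput()
        // Ask for a planar YUV format so the first plane is plain luminance.
        output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String:
                                kCVPixelFormatType_420YpCbCr8BiPlanarFullRange]
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "light.detect"))
        session.addOutput(output)
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let buffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        CVPixelBufferLockBaseAddress(buffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(buffer, .readOnly) }

        // Average the luma (Y) plane, sampling every 4th pixel in each direction.
        let width = CVPixelBufferGetWidthOfPlane(buffer, 0)
        let height = CVPixelBufferGetHeightOfPlane(buffer, 0)
        let rowBytes = CVPixelBufferGetBytesPerRowOfPlane(buffer, 0)
        guard let base = CVPixelBufferGetBaseAddressOfPlane(buffer, 0) else { return }
        let pixels = base.assumingMemoryBound(to: UInt8.self)

        var total = 0, count = 0
        for y in stride(from: 0, to: height, by: 4) {
            for x in stride(from: 0, to: width, by: 4) {
                total += Int(pixels[y * rowBytes + x])
                count += 1
            }
        }
        let brightness = Double(total) / Double(count) / 255.0
        onSample?(brightness > threshold)    // emit the on/off state for this frame
    }
}
```

Decoding an actual pattern (for example a blink sequence) then becomes a matter of timestamping the on/off transitions this produces.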

Is it possible to view panoramic image using google cardboard in iOS application?

I have been searching for ways to integrate the Google Cardboard SDK in iOS. One way is using Unity, but I am looking for something through which I can directly integrate the Cardboard SDK in iOS, and I want to view a panoramic image in that. Is there any way to do that?
I am looking for an iOS alternative for this project: Link Here
Okay, I've spent a few days getting CardboardSDK-iOS to do what I want (which is like the "Exhibit" demo in the Google Cardboard app), and I'm pretty pleased with it. I'm guessing that it's pretty faithful to the original, but since I'm not familiar with the original, I can't say for sure.
But I can say that it's not just a case of dropping a panoramic data set in. You need to do a bit of work to display the stereo image pair required, in OpenGL, depending on where the viewer has their head pointing. If you understand 3D transforms and how OpenGL works, and you've got your data prepared correctly, it should not be too onerous to get it working.
Of course, this is all done in Xcode in Objective-C/C++, not in Java. And I'm assuming that by "panoramic image" you mean you have a hemispherical stereo data set that should give you something like what you see in Google's Cardboard "Urban Hike" demo.
Hope this helps!
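To make the stereo-pair point a bit more concrete: rendering boils down to drawing the scene twice, once per eye, with each eye's view matrix offset by half the interpupillary distance from the head pose. A hedged sketch of that maths using simd; the 0.064 m IPD is just a typical default, and `headRotation` is assumed to come from your own motion handling (CardboardSDK-iOS has its own pipeline for all of this):

```swift
import simd

/// Builds a 4x4 translation matrix.
func translation(_ t: SIMD3<Float>) -> simd_float4x4 {
    var m = matrix_identity_float4x4
    m.columns.3 = SIMD4<Float>(t.x, t.y, t.z, 1)
    return m
}

/// View matrices for the left and right eyes, given the head's rotation
/// (e.g. derived from CoreMotion) and an interpupillary distance in metres.
func eyeViewMatrices(headRotation: simd_float4x4,
                     ipd: Float = 0.064) -> (left: simd_float4x4, right: simd_float4x4) {
    // World transform of each eye = head pose * sideways offset in head space;
    // the view matrix is the inverse of that transform.
    let leftEye  = headRotation * translation(SIMD3<Float>(-ipd / 2, 0, 0))
    let rightEye = headRotation * translation(SIMD3<Float>( ipd / 2, 0, 0))
    return (simd_inverse(leftEye), simd_inverse(rightEye))
}
```

Each view matrix is then combined with a per-eye perspective projection (plus the lens distortion correction the Cardboard SDK applies) before drawing the panorama geometry.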

Is it possible to use Vuforia without a camera?

Is it possible to use Vuforia without a camera for image tracking?
Basically I would like a function I could call with an image as an input parameter and the coordinates of an image target as a result. Does that exist?
It is unfortunately not possible. I've been looking for such an option myself several times while working on a Moodstocks (image recognition SDK) / Vuforia mashup (see these 2 blog posts if you are interested in it), but the Vuforia SDK prevents the use of any source other than the camera.
I guess the main reason for this is that camera management is fully handled internally by the Vuforia SDK, probably in order to make it easier to use, as managing the camera ourselves is at best a boring task (lines and lines of code to repeat in each project...) and at worst a huge pain in the ass (especially on Android, where some devices don't behave as expected).
By the way, it looks to me like the Vuforia SDK is not the best solution for your use case: it is mainly an augmented-reality SDK, focused on real-time tracking, which implies working with a camera stream... so using it to do "simple" image recognition looks like real overkill!

Animated 2d Map Desktop Widget

I need to animate some vector icons smoothly moving about a 2d map. I have time-lat/lng pairs forming tracks. Down the road I would really like to be able to convey various GIS data like topography and roads on the map along with my smoothly animated icons.
Any suggestions on what to use? I've found things like Quantum GIS, but it seems geared toward generating static maps. I've tried messing around with KML but I cannot find any way to make things move smoothly: marker icons visibly bounce along the waypoints even when I space them very closely.
EDIT: clarified I'm interested in a desktop widget
Animation options are limited in GIS as far as I am aware.
ESRI's ArcObjects could be used to create animations - see this chapter in the online help:
http://webhelp.esri.com/arcgisdesktop/9.3/index.cfm?TopicName=An_overview_of_animation
and these examples (however none have vectors moving around):
http://webhelp.esri.com/arcgisdesktop/9.3/index.cfm?TopicName=Sample_animation_videos
ESRI software is expensive to purchase, and users would also need the software if you wanted to provide more than an exported video.
You are probably best off working with WPF (is this widget for Windows?), Silverlight, or Flash. ESRI has a Silverlight example here:
http://www.codeproject.com/KB/showcase/GIS_Silverlight.aspx
There is also the following collection of WPF classes for the OpenSource SharpMap:
http://wpfsharpmapcontrols.codeplex.com/
However, it seems very much in beta at this stage.
Alternatively it may be easier to use GIS software solely to provide a background image, and do all the animation elsewhere.
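If you take that route, the "icons bounce along the waypoints" problem is usually solved by interpolating position as a function of time rather than snapping the marker from waypoint to waypoint, and redrawing at display rate. A small, toolkit-agnostic sketch of the interpolation (the `TrackPoint` type is made up for illustration):

```swift
import Foundation

struct TrackPoint {
    let time: TimeInterval   // seconds since some epoch
    let lat: Double
    let lon: Double
}

/// Position of the icon at an arbitrary time, linearly interpolated between
/// the surrounding track points (good enough for closely spaced GPS fixes).
/// The track is assumed to be sorted by time.
func position(onTrack track: [TrackPoint], at time: TimeInterval) -> (lat: Double, lon: Double)? {
    guard let first = track.first, let last = track.last else { return nil }
    if time <= first.time { return (first.lat, first.lon) }
    if time >= last.time { return (last.lat, last.lon) }

    // Find the segment containing `time` and interpolate within it.
    for i in 1..<track.count {
        let a = track[i - 1], b = track[i]
        if time <= b.time {
            let dt = b.time - a.time
            let f = dt > 0 ? (time - a.time) / dt : 0   // fraction along this segment
            return (a.lat + (b.lat - a.lat) * f,
                    a.lon + (b.lon - a.lon) * f)
        }
    }
    return (last.lat, last.lon)
}
```

Calling this once per frame (30-60 times a second) gives smooth motion regardless of how far apart the GPS fixes are; great-circle interpolation or easing can be swapped in later if linear interpolation isn't enough.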
I would say try this animation code for Google Earth; in any case, try emailing the OSGeo or QGIS user lists and they'll guide you.
