Is there an easy way, on iOS, to make a Google Street View GMSPanoramaView's camera follow the device's orientation using data from its motion sensors?
If not, has anyone already done it and can share a code snippet that takes data from CoreMotion, perhaps manipulates it to create a GMSPanoramaCamera, and passes it to the GMSPanoramaView with animateToCamera:animationDuration:?
Any relevant Android code would also be useful.
After checking the Maps SDK for iOS documentation on Street View, there is no built-in function or implementation that follows the device's orientation/gyroscope sensor.
According to Ziem's answer, you can try implementing this yourself (write your own function). He also gives pointers to study the following:
Set the camera orientation point of view
Animate the camera movements
Reference:
Blog
Github
Using these resources, they successfully created a function that lets you browse Google Street View panoramas with your smartphone/tablet as if you were inside them, just by moving your phone like a window onto the world.
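There is no official sample for this, but the glue code is small. Below is a minimal sketch of the idea, assuming iOS 11+ (for CMDeviceMotion.heading) and a portrait-held device; the class name and the pitch mapping are illustrative and will likely need tuning:

```swift
import CoreMotion
import GoogleMaps

/// Drives a GMSPanoramaView's camera from device motion.
/// Sketch only: assumes iOS 11+ (CMDeviceMotion.heading) and a portrait-held device.
final class PanoramaMotionController {
    private let motionManager = CMMotionManager()
    private weak var panoramaView: GMSPanoramaView?

    init(panoramaView: GMSPanoramaView) {
        self.panoramaView = panoramaView
    }

    func startFollowingDeviceOrientation() {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 30.0
        // A magnetic-north reference frame lets us read a compass-style heading.
        motionManager.startDeviceMotionUpdates(using: .xMagneticNorthZVertical, to: .main) { [weak self] motion, _ in
            guard let self = self, let motion = motion else { return }
            // Heading: degrees clockwise from magnetic north (iOS 11+).
            let heading = motion.heading
            // Rough pitch mapping for a portrait device: horizon ≈ 0°, up ≈ +90°, down ≈ -90°.
            let pitch = motion.attitude.pitch * 180.0 / .pi - 90.0
            let camera = GMSPanoramaCamera(heading: heading, pitch: pitch, zoom: 1)
            // animateToCamera:animationDuration: in Objective-C.
            self.panoramaView?.animate(to: camera, animationDuration: 1.0 / 30.0)
        }
    }

    func stopFollowingDeviceOrientation() {
        motionManager.stopDeviceMotionUpdates()
    }
}
```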
I would like to create an app that shows navigation to a marker. It would be best if that navigation could be provided inside the app; however, I am not sure if that's possible (at least with MapKit). If my app launched the Apple Maps app for navigation, could my app update the destination location? The thing is, in most cases my destination will change over time. What are my options?
I did some googling and found Skobbler, an SDK for in-app navigation, but it increases the app size by over 100 MB.
I have done this in an app that takes coordinates from a plist, maps them, and then provides directions from the user's location to the selected destination point. Try this link:
http://www.techotopia.com/index.php/Using_MKDirections_to_get_iOS_8_Map_Directions_and_Routes
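The tutorial essentially boils down to one MKDirections request. A rough sketch of that core call (the showRoute helper, destination parameter, and padding values are illustrative, not from the tutorial):

```swift
import UIKit
import MapKit

// Hypothetical helper; in the linked tutorial the destination comes from a plist.
func showRoute(on mapView: MKMapView, to destination: CLLocationCoordinate2D) {
    let request = MKDirections.Request()
    request.source = MKMapItem.forCurrentLocation()
    request.destination = MKMapItem(placemark: MKPlacemark(coordinate: destination))
    request.transportType = .automobile

    MKDirections(request: request).calculate { response, error in
        guard let route = response?.routes.first else {
            print("Directions failed: \(error?.localizedDescription ?? "unknown error")")
            return
        }
        // Draw the route; the map view's delegate must return an MKPolylineRenderer for this overlay.
        mapView.addOverlay(route.polyline)
        mapView.setVisibleMapRect(route.polyline.boundingMapRect,
                                  edgePadding: UIEdgeInsets(top: 40, left: 40, bottom: 40, right: 40),
                                  animated: true)
    }
}
```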
There are two approaches for showing an app or an app suggestion (in case it is not installed) on the iPhone lock screen / app switcher. One is GPS based, in which iOS decides which app to show as a suggestion. The other is beacon based, in which a particular beacon is identified.
If location services are enabled for multiple apps and, say, all of these apps are also using the beacon-based approach to show their icons in the left corner of the lock screen, which app icon will iOS show?
Since location services are enabled for these apps, and say there is another relevant app that is NOT using the beacon-based approach (just the GPS-based one), would iOS give preference to the beacon-based apps over this new GPS-based app?
For instance, Estimote’s NYC office is on the same block as an Equinox gym and our phones intelligently and automatically alert us to use that app. It’s super easy and intuitive to open the app while walking into the gym - and in the process, streamline the check-in flow with the gym’s front desk. However, because it solely uses GPS geofences, the accuracy is poor. We actually get the Equinox icon over 1 block away, and there is no control for the brands or stores (in this case Equinox) on how this appears.
Apple's suggestion of apps not installed on the phone based on proximity uses an undocumented technique. While I have verified it uses GPS as an input, I have never been able to confirm that beacons are used at all.
Regardless of whether beacons are used, because this is an undocumented feature, it is unlikely you will find a way to customize the behavior.
AFAIK, Apple has never shared the implementation details of how the lock screen icon AKA "suggested apps" feature works.
However, we did some experiments at Estimote and noticed that being inside a CLRegion (both the "GPS" CLCircularRegion and CLBeaconRegion work) that an app monitors via Core Location consistently makes the app's icon show up on the lock screen. So it seems that both beacons and GPS location feed into the same mechanism that governs the location-based suggestions. (Note that in iOS 9, that's not just the lock screen icon, but also a bar at the bottom of the app switcher.)
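For context, registering both kinds of regions looks roughly like this (a minimal sketch with placeholder coordinates, UUID, and identifiers; whether the icon actually appears is decided entirely by iOS):

```swift
import CoreLocation

final class RegionMonitor: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()

    func startMonitoring() {
        locationManager.delegate = self
        locationManager.requestAlwaysAuthorization()

        // A "GPS" geofence around a coordinate (placeholder values).
        let geofence = CLCircularRegion(
            center: CLLocationCoordinate2D(latitude: 40.7415, longitude: -73.9897),
            radius: 100,
            identifier: "office-geofence")
        locationManager.startMonitoring(for: geofence)

        // A beacon region (placeholder UUID and identifier).
        if let uuid = UUID(uuidString: "B9407F30-F5F8-466E-AFF9-25556B57FE6D") {
            let beaconRegion = CLBeaconRegion(proximityUUID: uuid, identifier: "office-beacons")
            locationManager.startMonitoring(for: beaconRegion)
        }
        // locationManager(_:didEnterRegion:) / didExitRegion would be handled here if needed;
        // the lock screen suggestion itself is not something the app can trigger or customize.
    }
}
```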
Unfortunately, we weren't able to establish what happens if you're inside multiple qualifying CLRegions, belonging to different apps. We suspect it might have something to do with the order in which the apps register regions for monitoring, but were never able to get consistent results.
Furthermore, since this whole behavior is undocumented, Apple can change it at any time. Just something to be aware of.
Side note: Handoff always trumps suggested apps.
I am trying to use the Google Maps API so that I can view the world in daylight or nighttime mode in real time, depending on sunrise/sunset, when the user zooms in on a certain region.
I have yet to have any luck, and would really appreciate any advice/tips!
As of right now, nighttime mode is not a feature of the Google Maps SDK for iOS.
You can track its progress in the issue tracker.
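Until that ships, one possible workaround, assuming your SDK version supports custom styling via GMSMapStyle, is to apply a dark style yourself when the sun is down. A rough sketch, with the sunrise/sunset calculation left out and only a tiny illustrative style JSON:

```swift
import GoogleMaps

// Pass isNight based on your own sunrise/sunset calculation (not shown here).
func applyDayNightStyle(to mapView: GMSMapView, isNight: Bool) {
    guard isNight else {
        mapView.mapStyle = nil   // back to the default daytime style
        return
    }
    // A tiny illustrative style; a real night theme would style many more map features.
    let nightStyleJSON = """
    [
      { "elementType": "geometry", "stylers": [ { "color": "#242f3e" } ] },
      { "elementType": "labels.text.fill", "stylers": [ { "color": "#746855" } ] }
    ]
    """
    do {
        mapView.mapStyle = try GMSMapStyle(jsonString: nightStyleJSON)
    } catch {
        print("Failed to apply night style: \(error)")
    }
}
```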
I am developing an iOS app that provides turn-by-turn navigation. With MKDirectionsRequest, the code can get navigation routes from the device's current location to the destination, and Core Location keeps the device's current location coordinates up to date.
How can I check whether the device is following the route once it starts to move?
Is there any API that can change the map's viewing perspective from a top-down view to a street-level view in MKMapView?
A typical GPS navigator provides this feature when the driver moves onto local roads after exiting the highway.
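There is no built-in MapKit check for "is the user still on the route", so a common approach is to measure how far each Core Location update is from the MKRoute polyline and treat anything beyond a tolerance as off-route. A rough sketch of that idea (the 30 m tolerance and the helper name are arbitrary):

```swift
import MapKit

/// Rough check: is `location` within `tolerance` meters of the route polyline?
/// The projection is done in map-point space, which is a good-enough approximation
/// for the small distances involved here.
func isLocation(_ location: CLLocation,
                onRoute route: MKRoute,
                tolerance: CLLocationDistance = 30) -> Bool {
    let userPoint = MKMapPoint(location.coordinate)
    let points = route.polyline.points()
    let count = route.polyline.pointCount
    guard count > 1 else { return false }

    var minDistance = CLLocationDistance.greatestFiniteMagnitude
    for i in 0..<(count - 1) {
        let a = points[i]
        let b = points[i + 1]
        // Project the user's point onto segment a-b, clamped to the segment.
        let dx = b.x - a.x, dy = b.y - a.y
        let lengthSquared = dx * dx + dy * dy
        var t = lengthSquared > 0 ? ((userPoint.x - a.x) * dx + (userPoint.y - a.y) * dy) / lengthSquared : 0
        t = max(0, min(1, t))
        let closest = MKMapPoint(x: a.x + t * dx, y: a.y + t * dy)
        minDistance = min(minDistance, closest.distance(to: userPoint))
    }
    return minDistance <= tolerance
}
```

You would call something like this from the CLLocationManager delegate's didUpdateLocations callback and re-request directions once it returns false for a few consecutive updates. As for the perspective question: MKMapView has no street-level imagery, but setting an MKMapCamera with a low altitude and a non-zero pitch gives a tilted, ground-level-style 3D view.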