I'm working on an app that tries to keep two MKMapViews synchronized with respect to scale. I spent a few days debugging on the iOS simulator and was getting increasingly frustrated that attempts to set a map view's scale, whether by setting the region or the map rect, yielded results wildly different from what I expected.
When I tried the app on the most convenient iOS device at hand (an iPad mini), MapKit worked mostly as expected and I was able to resolve the remaining nuances quickly. At this point I can get both maps on the device to display identical areas (to within 10 m or less in each dimension); on the simulator, setting a map's scale yields a result that is sometimes off by as much as 2x.
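For reference, here is a minimal sketch of the kind of scale synchronization involved (the delegate wiring and names are illustrative, not the actual app code):

```swift
import MapKit

// Illustrative sketch: keep secondaryMap's span (scale) in step with primaryMap.
// Assumes both map views share this object as their delegate.
final class SyncedMapsController: NSObject, MKMapViewDelegate {
    @IBOutlet weak var primaryMap: MKMapView!
    @IBOutlet weak var secondaryMap: MKMapView!

    func mapView(_ mapView: MKMapView, regionDidChangeAnimated animated: Bool) {
        guard mapView === primaryMap else { return }
        // Copy the span (scale) from the primary map but keep the secondary map's own center.
        var region = secondaryMap.region
        region.span = primaryMap.region.span
        secondaryMap.setRegion(region, animated: false)
    }
}
```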
Has anybody else experienced this disparity between the simulator and the device? If so, any explanation?
Thanks in advance.
The scale of a MapKit view cannot be set exactly, neither for one view nor for both.
At least this is true before iOS 6.
The reason is that MapKit zooms to the nearest suitable Google resolution: if you ask for a scale, say, 5% larger than the nearest Google zoom level, it will still snap to that level's resolution.
So up to and including iOS 5 it is not possible to programmatically zoom to an exact value. (I have seen no post describing the behavior of the iOS 6 Apple maps.)
So in your case, one view could snap to one of the 16 Google zoom levels while the other view falls into another zoom level.
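A quick way to observe this snapping (illustrative only):

```swift
import MapKit

// Ask for a span ~5% larger than the current one and see what MapKit actually
// applies; it typically snaps to the nearest zoom step rather than honoring
// the exact value.
func requestSlightlyLargerSpan(on mapView: MKMapView) {
    let requested = MKCoordinateRegion(
        center: mapView.centerCoordinate,
        span: MKCoordinateSpan(latitudeDelta: mapView.region.span.latitudeDelta * 1.05,
                               longitudeDelta: mapView.region.span.longitudeDelta * 1.05))
    mapView.setRegion(requested, animated: false)
    print("requested:", requested.span, "applied:", mapView.region.span)
}
```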
Related
The problem I'm facing is similar and closely related to this issue on GitHub, but that one is about the Unity SDK; my question is about the iOS SDK.
I want to achieve the same thing. Let me explain: basically I have a pixel grid in which every pixel has equal size. A pixel is set to be 10 m x 10 m in the real world. What I experienced is that if a pixel is located towards the northern or southern part of the world, its size is stretched, as shown below.
[Screenshot: pixels stretched near the top and bottom of the world map]
But when such a pixel is located along the equator, or simply around the middle part of the world, it looks OK, as shown below.
[Screenshot: pixels appear square near the equator]
There's no problem with rendering or positioning on Mapbox. The thing is, I want every pixel to be visually square.
I've read through the issue I linked above. It comes down to the Mercator projection: the world is not flat, which is what produces this stretching along the northern and southern parts of the world map. I also found that the iOS SDK has no equivalent of the functionality presented in the Unity SDK for this particular problem, so I'm not sure which approach to take.
How can I achieve an equal visual pixel size on the grid using the Mapbox iOS SDK? Are there solutions already provided in the SDK?
FYI.
My requirement also involves real distances as shown on the map. I'm not sure whether that affects the solution presented in the issue I linked above.
I use Mapbox iOS SDK 3.7.6
My initial approach was straightforward: I fix the pixel size at 10 m x 10 m, calculate the corresponding latitude and longitude values, and use those values to position the pixels in Mapbox, treating the entire world map as a tilemap. I didn't take the Mercator projection into account in the calculation, so that might be the cause; if so, how do I do that? The only relevant thing I found in the iOS SDK is MGLMapView's metersPerPoint(atLatitude:). There is no tile ID system or Conversions.cs as seen in the Unity SDK, so I'm not sure how to go about solving this.
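For reference, here is roughly how that method can be used; a minimal sketch under the assumption that mapView is the MGLMapView in question:

```swift
import CoreLocation
import Mapbox

// How many screen points does a 10 m "pixel" span at a given latitude?
// metersPerPoint(atLatitude:) is the only relevant helper I found in the iOS SDK.
func pointsPerPixel(on mapView: MGLMapView,
                    atLatitude latitude: CLLocationDegrees,
                    pixelSizeInMeters: CLLocationDistance = 10) -> Double {
    let metersPerPoint = mapView.metersPerPoint(atLatitude: latitude)
    return pixelSizeInMeters / metersPerPoint
}

// Comparing the result at latitude 0 with latitude 60 shows how much wider the
// same 10 m is drawn away from the equator, which is the stretching seen above.
```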
Update
I managed to solve it and made it work!
I'll come back and post the solution.
My solution was to port sphericalmercator.js to Swift, then use it in code. I use a fixed zoom level of 22, as its visual look is closest to what I need and to what I had before. I went with the approach of making the pixels at least look visually equal, not necessarily equal in physical size.
Thanks to a hint in this answer on how to use sphericalmercator.js.
One note from my testing: the tile size you set when creating a SphericalMercator instance seems to have no effect, whatever value I pass. Only the zoom level determines the number of tiles across the world map. Note that the upper-left corner is the origin, i.e. tile index 0,0. A lower zoom level produces larger tiles; a higher value produces smaller tiles.
You can take a look at SphericalMercator-swift, the code I ported from the original JS implementation linked above, along with examples of how to use it in Swift to get a tile index or a longitude/latitude bounding box in order to render things on top of Mapbox.
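For the record, the core of the tile math is the standard Web Mercator formula; a simplified sketch (not the exact ported code) looks like this:

```swift
import CoreLocation
import Foundation

// Standard Web Mercator (slippy-map) tile math, the same formulas that
// sphericalmercator.js implements.
func tileIndex(for coordinate: CLLocationCoordinate2D, zoom: Int) -> (x: Int, y: Int) {
    let n = pow(2.0, Double(zoom))                       // tiles per axis at this zoom
    let x = Int((coordinate.longitude + 180.0) / 360.0 * n)
    let latRad = coordinate.latitude * .pi / 180.0
    let y = Int((1.0 - log(tan(latRad) + 1.0 / cos(latRad)) / .pi) / 2.0 * n)
    return (x, y)                                        // (0,0) is the upper-left tile
}

// At zoom 22 there are 2^22 tiles per axis.
let tile = tileIndex(for: CLLocationCoordinate2D(latitude: 13.75, longitude: 100.5), zoom: 22)
print(tile)
```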
I used to test my apps on an iPhone 5S - now I have switched to an iPhone SE.
Now I am asking myself why the default form slide transitions still stutter on such a fast device - the animation should always appear smooth if the calculated locations are correct relative to the timeline.
Looking into CommonTransitions I saw that there is a CommonTransitions.TYPE_FAST_SLIDE and wondered whether this is the key to smooth transitions - is it?
However, in Theme Constants in the Codename One Designer there is no fastSlide option under FormTransitionOut - why is that?
I've wondered the same, although I have only a cheap device to test on.
This may only add fuel to the fire, but have you tried Display.setFramerate(int rate)? The docs say the default is an (attempted) rate of 10 redraws per second.
Maybe it would look smoother with 20?
I'm almost finished with my iOS game written in Swift + SpriteKit.
It's quite a simple game: 30-32 nodes at most, only one thing has physics, and the rest is a few animated clouds (around 6). CPU usage is around 2-3% and peak RAM usage is 75-80 MB.
On top of that, I also get frame drops when changing from one scene to another. Why could that be?
(I'm pre-loading all the textures and sounds during game init, not in the scenes themselves.)
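For context, the preloading is done roughly like this (a simplified sketch; the asset names are just placeholders):

```swift
import SpriteKit

// Keep the preloaded textures alive in a cache so SpriteKit does not evict and
// re-upload them when the scene changes.
enum TextureCache {
    static let cloudTextures = (1...6).map { SKTexture(imageNamed: "cloud\($0)") }

    static func preload(completion: @escaping () -> Void) {
        SKTexture.preload(cloudTextures, withCompletionHandler: completion)
    }
}

// Called once during game init, before the first scene is presented:
// TextureCache.preload { /* present the first scene */ }
```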
When I use the simulator, from the 5S up to the 6S Plus, I don't see any frame drops. So that's weird - it looks like it's not my game but my iPhone 6S?
Now, I also have other games from different developers installed on the same device, and I frequently get random frame drops there too: it lags for 2-3 seconds and then comes back to 60 fps.
Does anyone know if this is something that started happening after some iOS update? I was even thinking it may be some kind of background service killing my phone - Facebook, WhatsApp, Messenger, etc.
Is there any way I could possibly check on what's going on?
Was this caused by newer versions of SpriteKit defaulting to the Metal render mode instead of OpenGL? For example, do your problems go away when PrefersOpenGL=YES is added to Info.plist? I covered a bit of this performance issue in my blog post about a SpriteKit repeat shader. Note that you should only be testing on an actual iOS device, not the simulator.
I am working on an iOS app using the ti.map module (Titanium). The problem I'm facing is that when I zoom out of the map page completely, the app closes itself. Please suggest how I can solve this issue - is there anything like setting a minimum/maximum zoom level for zooming out of the map on iOS? An answer as soon as possible would help us release the app to the stores.
You should use the "zoom" method of the map view. Reference: http://docs.appcelerator.com/platform/latest/#!/api/Titanium.Map.View-method-zoom
A positive value increases the current zoom level and a negative value decreases the zoom level. Each increase in zoom level increases the magnification by a factor of two.
After updating to iOS 6 I have noticed severe performance decreases when panning or zooming an MKMapView with multiple overlays. An app I created has approximately 600 polygon overlays of various colours; it ran lag-free on iOS 5 (even on older iOS devices), but now lags badly when zooming and panning on iOS 6, even on the latest devices.
My hunch is that this is because the device now has to render the map dynamically (since it's vector-based) rather than just display pre-rendered tiles onscreen.
Has anyone got any ideas to reduce the lag experienced when panning or zooming the map?
Some extra info: this low frame rate also occurs while zooming or panning in areas where the overlays are not displayed on screen at all, so it is not down to the overlays being created as they come onscreen.
You can try combining all of your overlays into a single one. This can dramatically boost performance.
The idea is to create an overlay with a bounding box that encompasses all of your polygons. This way your mapView:viewForOverlay: will always be called. Create a property on your overlay that holds all of your polygons. Then, in the drawMapRect: method of your overlay view, test each polygon for intersection with mapRect and draw it only if necessary. This is important since you don't want to be drawing polygons that are off screen.
This strategy is based on Apple's own MapKit example projects. Check out HazardMap for an example of drawing several objects in a single MKOverlayView, and check out BreadCrumb for an example of how to efficiently test polygons for intersection with the current mapRect in drawMapRect:.
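Roughly, the combined-overlay approach looks like this in Swift using the newer MKOverlayRenderer API (a sketch with illustrative names, not Apple's sample code):

```swift
import MapKit
import UIKit

// A single overlay that owns every polygon; its bounding rect covers them all,
// so MapKit asks our renderer to draw for any visible map rect.
final class CombinedPolygonOverlay: NSObject, MKOverlay {
    let polygons: [MKPolygon]
    let boundingMapRect: MKMapRect
    var coordinate: CLLocationCoordinate2D {
        MKMapPoint(x: boundingMapRect.midX, y: boundingMapRect.midY).coordinate
    }

    init(polygons: [MKPolygon]) {
        self.polygons = polygons
        self.boundingMapRect = polygons.reduce(MKMapRect.null) { $0.union($1.boundingMapRect) }
        super.init()
    }
}

// In mapView(_:rendererFor:), return CombinedPolygonRenderer(overlay: overlay).
final class CombinedPolygonRenderer: MKOverlayRenderer {
    override func draw(_ mapRect: MKMapRect, zoomScale: MKZoomScale, in context: CGContext) {
        guard let overlay = overlay as? CombinedPolygonOverlay else { return }
        // Only draw polygons that actually intersect the tile being rendered.
        for polygon in overlay.polygons where polygon.boundingMapRect.intersects(mapRect) {
            let path = CGMutablePath()
            let points = polygon.points()
            for i in 0..<polygon.pointCount {
                let cgPoint = point(for: points[i])
                if i == 0 { path.move(to: cgPoint) } else { path.addLine(to: cgPoint) }
            }
            path.closeSubpath()
            context.addPath(path)
            context.setFillColor(UIColor.red.withAlphaComponent(0.3).cgColor)
            context.fillPath()
        }
    }
}
```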
I have a minimalistic MapKit tech demo and it lags noticeably when I run it on an iPad 3 with iOS 6. Profiling reveals that it is CPU bound, but only 0.2% of the time is spent in my own code. The big culprits in my case are rendering roads, followed by rendering labels - both done by MapKit. I am showing downtown San Francisco at a 5 km scale, so there are a lot of roads and labels to render.
So the moral of the story is: iOS 6 maps are SLOW. I can't tell you how this compares to iOS 5 or to an iPad 2, though. But it is lagging, and I am barely doing any work of my own at all.
P.S:
Open Instruments and use the Time Profiler. Make a recording + drill down to find your culprits. Then check 'hide system libraries' to find out how much of the lag is your responsibility vs MapKit's. Then optimize only as needed.