Increase tap area of GMSPolyline - iOS

In our app, users can create GMS features on a map. Those features are then editable, with the editing process starting from a tap. In some cases, if the style applied to a polyline is a 1- or 2-point-wide line, it is difficult for a user to tap. I have researched this and found nothing on how to increase the tap area.
I would like to do something like adding a buffer around the line, such as in this example, but the buffer would not be visible to the user; it would only provide an increased tap area.
Does anyone know if this is possible? Any good resource on how to do this? Thanks!

If you want to know whether the user tapped a location within a certain distance of a GMSPath, you can use GMSGeometryIsLocationOnPathTolerance.
Combine that with the mapView:didTapAtCoordinate: GMSMapViewDelegate method and you're all set.
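A minimal Swift sketch of that combination; the `polylines` array, the `tapTolerance` value, and the `beginEditing` call are assumptions standing in for however your app tracks its editable features:

```swift
import GoogleMaps

class MapController: NSObject, GMSMapViewDelegate {
    // Hypothetical storage for the user's editable features.
    var polylines: [GMSPolyline] = []

    // The invisible "buffer": taps within this many metres of the path count as hits.
    let tapTolerance: CLLocationDistance = 20

    func mapView(_ mapView: GMSMapView, didTapAt coordinate: CLLocationCoordinate2D) {
        for polyline in polylines {
            guard let path = polyline.path else { continue }
            if GMSGeometryIsLocationOnPathTolerance(coordinate, path, true, tapTolerance) {
                beginEditing(polyline)   // hypothetical entry point into your edit flow
                break
            }
        }
    }

    func beginEditing(_ polyline: GMSPolyline) {
        // Start your editing workflow here.
    }
}
```

Note that the tolerance is expressed in metres, so the effective on-screen tap area shrinks as the user zooms out; if you want a roughly constant on-screen buffer you would need to scale the tolerance with the current zoom level.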

Related

Swift - Detect the user drawing through defined points/areas.

My app already has drawing enabled for the user on the screen. How can I define certain points on the screen and detect when the user draws through them? I've read about a few Swift methods but can't quite grasp whether they are applicable to what I need, and I can't find any "collision" methods.
You can use CGRect's contains(_:) method. However, I would recommend using a rectangle rather than a single point; it's very difficult to draw over one exact point. So define a CGRect for each target area and check whether it contains the point the user touched.
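A minimal sketch of that idea, assuming the drawing view receives the stroke via touchesMoved(_:with:); `targetAreas` and `hitAreas` are hypothetical names:

```swift
import UIKit

class DrawingView: UIView {
    // Rectangles the drawing must pass through (placeholder frames).
    let targetAreas: [CGRect] = [
        CGRect(x: 100, y: 100, width: 40, height: 40),
        CGRect(x: 220, y: 300, width: 40, height: 40)
    ]
    private var hitAreas = Set<Int>()

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesMoved(touches, with: event)
        guard let point = touches.first?.location(in: self) else { return }
        // Record every target rectangle the drawn stroke passes through.
        for (index, rect) in targetAreas.enumerated() where rect.contains(point) {
            hitAreas.insert(index)
        }
    }
}
```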

iOS: Segmented touchmap over image

I've been trying to find a way to solve this problem, and haven't been able to find anything useful, so forgive me if this is a duplicate of something I couldn't find.
I have, essentially, a large complicated image in the style of a stained glass window in a scroll view so that I can pan and zoom around it. Each of the individual segments of the window has some information associated with it. What I need to be able to do is tap on any of the segments and determine which segment was tapped so that I can display the information. What I'm not sure of is how to do the mapping between touch points and segments. Most of the segments aren't even regular polygon shapes let alone orthogonal squares, so I can't think of a straightforward way to determine which segment I've tapped.
If anybody has any ideas as to how I might go about implementing this, it would be most appreciated!
Cheers
Put each individual segment in a different layer. Then you can hit-test which layer was tapped. Design the test so that if a layer is tapped on a transparent area (i.e. outside its segment), the test falls through to the next layer behind it. The test succeeds when you find a layer whose non-transparent region is under the tap; since there is one segment per layer, that layer identifies the segment.
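A minimal sketch of that kind of fall-through test, assuming one CALayer per segment collected in a hypothetical `segmentLayers` array and a tap point expressed in their common superlayer's coordinate space. It renders just the pixel under the tap for each layer, topmost first, and returns the first layer that is opaque there:

```swift
import UIKit

func segmentLayer(at point: CGPoint, in segmentLayers: [CALayer]) -> CALayer? {
    for layer in segmentLayers.reversed() {                      // topmost layer first
        let localPoint = layer.convert(point, from: layer.superlayer)
        guard layer.bounds.contains(localPoint) else { continue }

        // Render the single pixel under the tap and inspect its alpha channel.
        var pixel = [UInt8](repeating: 0, count: 4)
        let opaque: Bool = pixel.withUnsafeMutableBytes { buffer in
            guard let context = CGContext(data: buffer.baseAddress,
                                          width: 1, height: 1,
                                          bitsPerComponent: 8, bytesPerRow: 4,
                                          space: CGColorSpaceCreateDeviceRGB(),
                                          bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
            else { return false }
            context.translateBy(x: -localPoint.x, y: -localPoint.y)
            layer.render(in: context)
            return buffer[3] > 0          // alpha byte of the RGBA pixel
        }
        if opaque {
            return layer                  // this layer's segment was tapped
        }
    }
    return nil                            // tap landed outside every segment
}
```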

Drawing World Map - Performance & Interaction - iOS

I’d like to use a shapefile to generate an interactive world map. I was able to import the data and use Core Graphics paths to draw the map into one large view.
The map needs to support panning, zooming and touch interaction. For that, I've created a UIScrollView and placed the MapView (large view with all of the countries drawn) into it.
I need to improve two aspects of it:
Performance / rendering
I have drawn the map much larger than the screen size so that it looks reasonable when zoomed in. There are a few problems with this. First, when I'm zoomed out, I need the border strokes to be wider so they are visible, and when I zoom in, I'd like them to be thinner. Also, when I zoom in, the map is still a bit blurry, and I don't want to increase the view size much further.
How can I make the map look crisp when zoomed in? I attempted to redraw the map on zoom, but it takes far too long. Can I somehow re-render only what is on screen?
Touch Interaction
I need to be able to have a touch event for every different country.
Possible approach?
I was thinking of trying to separate every country onto its own view. That should make touches easy to handle. Then I'm thinking I can redraw only the views that are currently on screen / zoomed to.
I've played with an app that does something similar ("World Maps"), and I can see that when you pan or zoom, the map is blurry for a second but then becomes clear. What is going on there?
Use MapKit and provide custom tiles. Don't reinvent the wheel by writing yet another map framework.
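If you go that route, a minimal MKTileOverlay sketch might look like this; the tile URL template is a placeholder for wherever your pre-rendered tiles live (locally in the bundle or on a server):

```swift
import MapKit
import UIKit

class WorldMapViewController: UIViewController, MKMapViewDelegate {
    let mapView = MKMapView()

    override func viewDidLoad() {
        super.viewDidLoad()
        mapView.frame = view.bounds
        mapView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        mapView.delegate = self
        view.addSubview(mapView)

        // Placeholder template: pre-render the shapefile into standard map tiles first.
        let overlay = MKTileOverlay(urlTemplate: "https://example.com/tiles/{z}/{x}/{y}.png")
        overlay.canReplaceMapContent = true   // show only your tiles, not Apple's base map
        mapView.addOverlay(overlay, level: .aboveLabels)
    }

    func mapView(_ mapView: MKMapView, rendererFor overlay: MKOverlay) -> MKOverlayRenderer {
        if let tileOverlay = overlay as? MKTileOverlay {
            return MKTileOverlayRenderer(tileOverlay: tileOverlay)
        }
        return MKOverlayRenderer(overlay: overlay)
    }
}
```

MapKit then handles panning, zooming, and loading only the tiles that are currently on screen, which is likely the tile-by-tile loading you noticed in the other app.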
I think you have to create images of each area at different scales from the big map image. Think of how Google Maps works: it provides pre-rendered images at many zoom factors for every area of the world and merges the ones that are needed onto the screen as the user views them.
Rendering the whole map in code on the fly is beyond what a current iPhone can do.

Graph should not move if not zoomed

I am using Core Plot to draw a graph. The graph always moves, whether it is zoomed or not, but I want it not to move until it is zoomed. When zoomed in, panning (moving left/right) should be allowed so the whole graph can be seen. Any help would be appreciated.
I can think of several ways to do this:
Disable user interaction when zoomed out. Enable it when the user zooms in.
Set the globalXRange and globalYRange on the plot space. The plot won't scroll outside these ranges no matter how far out you zoom (see the sketch after this list).
Use a plot space delegate. Implement the -plotSpace:willChangePlotRangeTo:forCoordinate: delegate method. If the proposed range is outside the desired scrolling range, modify the range to your preferred range before returning it.
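A minimal sketch of the second option, assuming a CPTXYGraph called `graph`; the ranges here are placeholders for the full extent of your data:

```swift
import CorePlot

func constrainScrolling(of graph: CPTXYGraph) {
    guard let plotSpace = graph.defaultPlotSpace as? CPTXYPlotSpace else { return }

    plotSpace.allowsUserInteraction = true

    // The user can zoom and pan, but the visible range can never leave these bounds.
    plotSpace.globalXRange = CPTPlotRange(location: NSNumber(value: 0),
                                          length: NSNumber(value: 10))
    plotSpace.globalYRange = CPTPlotRange(location: NSNumber(value: 0),
                                          length: NSNumber(value: 100))
}
```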

Detect touch coordinates on a single UIImageView

I have a single imageview with a static country map divided into regions of that country. What I'd like to do is detect the touch location on the image and provide content about the corresponding region. Since region borders are not linear, how can I save each region's area? Do I need several imageviews (or even custom UIButtons with those images) each belonging to one region or is it possible the way I'd like?
This is the first thing that came to my mind so maybe there is a simpler and better way which I'd love to hear about and I couldn't know how to search for this so apologies if there is a duplicate. And of course I'd appreciate the help.
Thanks
One simple & exact way would be to use Ole Begemann's OBShapedButtons - one for each state.
This will allow you to detect exactly which state was selected. Simply put image of each state in separate buttons (with transparent surroundings) and align buttons next to eachother so that state-borders allign one to another.
Buttons will detect the location of the press and if one button was pressed on its transparent region the touch will be passed along until the correct button gets it.
The simplest way to do something like what you want would be to just put some UIButtons over the top of your UIImageView, making their type custom (so they are transparent). Generally fill in the area of each country with UIButtons. If you test your app, you will probably notice that people will touch the center of each country, so I wouldn't worry about getting 100% coverage. Depending on the shape of the countries, one or two square UIButtons would probably be enough.
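A minimal sketch of that approach; the frames and region names are made up, and note that a UIImageView has user interaction disabled by default:

```swift
import UIKit

func addRegionButtons(over imageView: UIImageView,
                      target: Any, action: Selector) {
    imageView.isUserInteractionEnabled = true   // off by default on UIImageView

    // Placeholder rectangles roughly covering the centre of each region.
    let regions: [(name: String, frame: CGRect)] = [
        ("North Region", CGRect(x: 60, y: 40, width: 80, height: 60)),
        ("South Region", CGRect(x: 70, y: 180, width: 90, height: 70))
    ]

    for (index, region) in regions.enumerated() {
        let button = UIButton(type: .custom)    // custom type draws nothing, so it's invisible
        button.frame = region.frame
        button.tag = index                      // tells the handler which region was tapped
        button.accessibilityLabel = region.name
        button.addTarget(target, action: action, for: .touchUpInside)
        imageView.addSubview(button)
    }
}
```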
If you did want to go the 100% coverage route, you could embed each country in a separate UIImageView subclass, and when you detect a touch anywhere, go through each country image and see if the point touched isn't transparent. When you hit that (a non-transparent pixel) you have found the country. See this post for relevant code: How to get the color of a pixel in an UIView?
