Fancy effects on MKOverlayView CGPath (iOS)

I'm using an MKOverlayView to draw a path on top of Apple's maps. I'd like to draw many short paths because I need to colorize the track depending on some other values, but I'm getting some fancy effects doing it that way. My start and end points are also connected, and I don't know why. After zooming in/out, the fancy-effect pattern changes and gets bigger/smaller. It looks as though the Apple map tiles are visible in my path.
This is my code; it's called inside the drawMapRect method of my overlay view.
for (int i = 0; i < tdpoints.pointCount - 1; i++) {
    CGPoint firstCGPoint = [self pointForMapPoint:tdpoints.points[i]];
    CGPoint secCGPoint = [self pointForMapPoint:tdpoints.points[i+1]];
    if (lineIntersectsRect(tdpoints.points[i], tdpoints.points[i+1], clipRect)) {
        double val1 = (arc4random() % 10) / 10.0f;
        double val2 = (arc4random() % 10) / 10.0f;
        double val3 = (arc4random() % 10) / 10.0f;
        CGContextSetRGBStrokeColor(context, val1, val2, val3, 1.0f);
        CGContextSetLineWidth(context, lineWidth);
        CGContextBeginPath(context);
        CGContextMoveToPoint(context, firstCGPoint.x, firstCGPoint.y);
        CGContextAddLineToPoint(context, secCGPoint.x, secCGPoint.y);
        CGContextStrokePath(context);
        CGContextClosePath(context);
    }
}
http://imageshack.us/photo/my-images/560/iossimulatorbildschirmf.jpg/
http://imageshack.us/photo/my-images/819/iossimulatorbildschirmf.jpg/
I'm adding my GPS points like this (from Apple's Breadcrumb example):
CLLocationCoordinate2D coord = {.latitude = 49.1, .longitude = 12.1f};
[self drawPathWithLocations:coord];
CLLocationCoordinate2D coord1 = {.latitude = 49.2, .longitude = 12.2f};
[self drawPathWithLocations:coord1];
CLLocationCoordinate2D coord2 = {.latitude = 50.1, .longitude = 12.9f};
[self drawPathWithLocations:coord2];
This is the adding method:
- (void)drawPathWithLocations:(CLLocationCoordinate2D)coord {
    if (!self.crumbs)
    {
        // This is the first time we're getting a location update, so create
        // the CrumbPath and add it to the map.
        //
        _crumbs = [[CrumbPath alloc] initWithCenterCoordinate:coord];
        [self.trackDriveMapView addOverlay:self.crumbs];
        // On the first location update only, zoom the map to the user location
        [_trackDriveMapView setCenterCoordinate:coord zoomLevel:_zoomLevel animated:NO];
    } else
    {
        // This is a subsequent location update.
        // If the crumbs MKOverlay model object determines that the current location has moved
        // far enough from the previous location, use the returned updateRect to redraw just
        // the changed area.
        //
        // Note: iPhone 3G locates you using cell-tower triangulation,
        // so you may experience spikes in location data (in small time intervals).
        //
        MKMapRect updateRect = [self.crumbs addCoordinate:coord];
        if (!MKMapRectIsNull(updateRect))
        {
            // There is a non-null update rect.
            // Compute the currently visible map zoom scale
            MKZoomScale currentZoomScale = (CGFloat)(self.trackDriveMapView.bounds.size.width / self.trackDriveMapView.visibleMapRect.size.width);
            // Find the line width at this zoom scale and outset the updateRect by that amount
            CGFloat lineWidth = MKRoadWidthAtZoomScale(currentZoomScale);
            updateRect = MKMapRectInset(updateRect, -lineWidth, -lineWidth);
            // Ask the overlay view to update just the changed area.
            [self.crumbView setNeedsDisplayInMapRect:updateRect];
        }
    }
}
This is the addCoordinate method:
- (MKMapRect)addCoordinate:(CLLocationCoordinate2D)coord
{
    pthread_rwlock_wrlock(&rwLock);
    // Convert a CLLocationCoordinate2D to an MKMapPoint
    MKMapPoint newPoint = MKMapPointForCoordinate(coord);
    MKMapPoint prevPoint = points[pointCount - 1];
    // Get the distance between this new point and the previous point.
    CLLocationDistance metersApart = MKMetersBetweenMapPoints(newPoint, prevPoint);
    NSLog(@"POINTS ARE %f METERS APART ...", metersApart);
    MKMapRect updateRect = MKMapRectNull;
    if (metersApart > MINIMUM_DELTA_METERS)
    {
        // Grow the points array if necessary
        if (pointSpace == pointCount)
        {
            pointSpace *= 2;
            points = realloc(points, sizeof(MKMapPoint) * pointSpace);
        }
        // Add the new point to the points array
        points[pointCount] = newPoint;
        pointCount++;
        // Compute the MKMapRect bounding prevPoint and newPoint
        double minX = MIN(newPoint.x, prevPoint.x);
        double minY = MIN(newPoint.y, prevPoint.y);
        double maxX = MAX(newPoint.x, prevPoint.x);
        double maxY = MAX(newPoint.y, prevPoint.y);
        updateRect = MKMapRectMake(minX, minY, maxX - minX, maxY - minY);
    }
    pthread_rwlock_unlock(&rwLock);
    return updateRect;
}
Hint
I think my refresh algorithm only refreshes one tile of the whole map on the screen, and because the drawMapRect method is called every time for this specific area, a new random color is generated. (The rest of the path is clipped, and the older color remains.)

The "fancy effects" you see are a combination of the way MKMapView calls drawMapRect and your decision to use random colours every time it is drawn. To speed up display when the user pans the map around, MKMapView caches tiles from your overlay. If a tile goes off screen it can be thrown away or stored in a different cache, but the tiles still on screen are just moved about and are not redrawn, which is good because redrawing might mean a trip to your data source or some other long calculation. That's why you call setNeedsDisplayInMapRect: only the affected tiles are redrawn, not everything.
This works in all the apps I've seen and is a good system on the whole, except when you draw something that isn't going to be the same each time, like your random colours. If you really want to colour the path like that, then you should use a hash or something that looks random but is actually based on something repeatable. Maybe the index of the point, multiplied by the point coordinate, MD5ed, then take the 5th character, and so on. Whatever it is, it must generate the same colour for the same line no matter how many times it is called. Personally I'd rather the line was one colour, maybe dashed, but that's between you and your users.
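For illustration only, here is a minimal sketch (not from my own apps) of deriving a repeatable colour from the segment index inside drawMapRect. The helper name and the particular hash are arbitrary; the only requirement is that the same index always produces the same colour:
// Hypothetical helper: a stable colour per segment index, so repeated
// drawMapRect calls over different tiles paint the same segment the same way.
static void SetStableSegmentColor(CGContextRef context, NSUInteger segmentIndex) {
    uint32_t h = (uint32_t)segmentIndex;
    h ^= h >> 16;          // simple integer mix; any repeatable hash works
    h *= 0x7feb352dU;
    h ^= h >> 15;
    CGFloat r = ((h >> 0)  & 0xFF) / 255.0;
    CGFloat g = ((h >> 8)  & 0xFF) / 255.0;
    CGFloat b = ((h >> 16) & 0xFF) / 255.0;
    CGContextSetRGBStrokeColor(context, r, g, b, 1.0);
}
Calling SetStableSegmentColor(context, i) in place of the three arc4random() lines keeps the colour variation but makes every tile agree on each segment's colour.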

That happens because you close the path: closing a path automatically draws a line between the last point and the first point.
Just remove the last line of your path drawing:
CGContextClosePath(context);

The purpose of CGContextClosePath is literally to close the path, i.e. to connect the start and end points. You don't need that here; CGContextStrokePath has already drawn the path, so remove that line. Also, if every segment were the same colour you could move CGContextStrokePath outside the loop (move/add line/move/add line... and then one stroke at the end); since you change the colour per segment, you do need one stroke per segment, as you have now.
For the "fancy effects" (tilted line joins), investigate the CGContextSetLineJoin and CGContextSetLineCap parameters.
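For example (round joins and caps are just one reasonable choice, set before stroking each segment):
// Round joins and caps soften the corners where one stroked segment meets the next.
CGContextSetLineJoin(context, kCGLineJoinRound);
CGContextSetLineCap(context, kCGLineCapRound);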

Related

How to attach sprites that collide?

I essentially want the "sprites" to stick together when they collide. However, I don't want the "joint" to be rigid; I want the sprites to be able to move around as long as they stay in contact with each other. Imagine two circles connected: you can move one circle around the other, as long as it remains in contact.
I found this question: How to make one body stick to another moving object in SpriteKit, and a lot of other resources that explain how to make sprites stick upon collision, but they all use SKJoints, which are rigid and not really flexible.
I guess another way to phrase it would be: I want the sprites to stick, but I want them to be able to "slide" on each other.
Well, I can think of one workaround, but this wouldn't work with irregular polygons.
Sticking (pun unintended) with your circles example, what if you lock the position of the movable circle?
let circle1 = center circle
let circle2 = movable circle
Knowing the width of both circles, you can enforce in the update function that the distance between them is exactly:
((circle1.frame.width / 2) + (circle2.frame.width / 2))
If you're up to it, here's some code to help you on your way.
override func update(currentTime: CFTimeInterval) {
    let distance = hypotf(Float(circle1.position.x - circle2.position.x), Float(circle1.position.y - circle2.position.y))
    // distance between the circle centers
    let radius = ((circle1.frame.width / 2) + (circle2.frame.width / 2))
    // distance the centers should keep from each other
    if distance != Float(radius) {
        // if distance is less or more than the radius
        let pointA = circle1.position
        let pointB = circle2.position
        let pointC = CGPointMake(pointB.x + 2, pointB.y)
        let angle_ab = atan2(pointA.y - pointB.y, pointA.x - pointB.x)
        let angle_cb = atan2(pointC.y - pointB.y, pointC.x - pointB.x)
        let angle_abc = angle_ab - angle_cb
        // get the angle between the circles using atan2
        let vectorx = cos(angle_abc)
        let vectory = sin(angle_abc)
        // convert the angle into a vector
        let x = circle1.position.x + radius * vectorx
        let y = circle1.position.y + radius * vectory
        // get new coordinates from the vector, radius and center circle position
        circle2.position = CGPointMake(x, y)
        // set the new position
    }
}
Well, you need to write code to make sure the movable circle is, well, movable.
But this should work.
I haven't tested this yet, though, and I haven't even taken geometry, let alone trig, in school yet.
If I'm reading your question as you intended it, you can still use joints: just create actions with inverse kinematics constraints that allow rotation and translation around the contacting circles' joint.
https://developer.apple.com/library/prerelease/ios/documentation/SpriteKit/Reference/SKAction_Ref/index.html#//apple_ref/doc/uid/TP40013017-CH1-SW72
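Alternatively (not part of the answer above, and assuming two SKNode circles named circle1 and circle2), a plain distance SKConstraint gives a similar "slide while staying in contact" behaviour without a rigid physics joint. A rough, untested sketch in Objective-C:
// Keep circle2 exactly one combined radius away from circle1's center; it can
// still be dragged anywhere on that ring, so it "slides" around circle1.
CGFloat contactDistance = circle1.frame.size.width / 2 + circle2.frame.size.width / 2;
SKRange *ring = [SKRange rangeWithConstantValue:contactDistance];
circle2.constraints = @[ [SKConstraint distance:ring toNode:circle1] ];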

Check if node is visible on the screen

I currently have a large map that goes off the screen; because of this, its coordinate system is very different from that of my other nodes. This has led to a problem: I need to generate a random CGPoint within the bounds of this map, and if that point is in frame/on screen I place a visible node there. However, the check on whether or not the node is on screen continuously fails.
I'm checking whether the node is in frame with the following code: CGRectContainsPoint(self.frame, values) (with values being the random CGPoint I generated). This is where my problem comes in: the coordinate system of the frame is completely different from the coordinate system of the map.
For example, in the picture below the ball with the arrows pointing to it is at coordinates (479, 402) in the scene's coordinates, but it is actually at (9691, 9753) in the map's coordinates.
I determined the coordinates using the touchesBegan event, for those who are wondering. So basically, how do I convert the map's coordinate system to one that will work for the frame?
Because, as seen below, the dot is obviously in the frame, yet CGRectContainsPoint always fails. I've tried scene.convertPoint(position, fromNode: map) but it didn't work.
Edit: (to clarify some things)
My view hierarchy looks something like this:
The map node goes off screen and is about 10,000 x 10,000 in size (I have it as a scrolling type map). The origin (or 0,0) of this node is in the bottom-left corner, where the map starts, meaning the origin is off screen. In the picture above, I'm near the top-right part of the map. I'm generating a random CGPoint with the following code (passing it the map's frame) as an extension to CGPoint:
static func randPoint(within: CGRect) -> CGPoint {
    var point = within.origin
    point.x += CGFloat(arc4random() % UInt32(within.size.width))
    point.y += CGFloat(arc4random() % UInt32(within.size.height))
    return point
}
I then have the following code (called in didMoveToView; note that I'm applying this to nodes I'm generating, I just left that code out), where values is the random position:
let values = CGPoint.randPoint(map.totalFrame)
if !CGRectContainsPoint(self.frame, convertPointToView(scene!.convertPoint(values, fromNode: map))) {
    color = UIColor.clearColor()
}
This is meant to make nodes that are off screen invisible (since the user can scroll the map background). However, the condition always evaluates to true, making all nodes invisible, even though some nodes are indeed within the frame (as seen in the picture above, where I commented out the clear-color code).
If I understand your question correctly, you have an SKScene that contains an SKSpriteNode larger than the scene's view, and you are randomly generating coordinates within that sprite's coordinate system that you want to map to the view.
You're on the right track with SKNode's convertPoint(_:fromNode:) (where your scene is the SKNode and your map is the fromNode). That gets you from the generated map coordinate to the scene coordinate. Next, convert that coordinate to the view's coordinate system using your scene's convertPointToView(_:). The point is out of bounds if it is not in view.
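A minimal sketch of that chain (Objective-C for brevity; self is the scene, map is the large sprite node from the question, and randomPoint stands for the generated coordinate):
// random point in the map node's coordinate space -> scene space -> view space
CGPoint scenePoint = [self convertPoint:randomPoint fromNode:map];
CGPoint viewPoint  = [self convertPointToView:scenePoint];
BOOL onScreen = CGRectContainsPoint(self.view.bounds, viewPoint);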
Using a worldNode which includes a playerNode, and having the camera center on that node, you can check whether an object is on or off screen with this code:
float left = player.position.x - 700;
float right = player.position.x + 700;
float up = player.position.y + 450;
float down = player.position.y - 450;
if ((object.position.x > left) && (object.position.x < right) && (object.position.y > down) && (object.position.y < up)) {
    if ((object.parent == nil) && (object.dead == false)) {
        [worldNode addChild:object];
    }
} else {
    if (object.parent != nil) {
        [object removeFromParent];
    }
}
The numbers I used above are static. You can also make them dynamic:
CGRect screenRect = [[UIScreen mainScreen] bounds];
CGFloat screenWidth = screenRect.size.width;
CGFloat screenHeight = screenRect.size.height;
Dividing the screenWidth by 2 gives left and right; same for screenHeight.
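For instance, a sketch (untested) of the same check with the hard-coded 700/450 values replaced by the screen-derived halves:
// Half the screen size on each side of the player, instead of fixed 700/450.
float left  = player.position.x - screenWidth  / 2;
float right = player.position.x + screenWidth  / 2;
float up    = player.position.y + screenHeight / 2;
float down  = player.position.y - screenHeight / 2;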

Cannot figure out correct vertexZ in Isometric Tiled maps when creating sprites

I have a 50 x 50 isometric Tiled map with a base tile of 64 x 32.
I am using this function to create a sprite and add it to a particular tile dynamically.
-(void)addTile:(NSString *)tileName AtPos:(CGPoint)tilePos onTileMap:(CCTMXTiledMap *)tileMap
{
    CCTMXLayer *floorLayer = [tileMap layerNamed:@"FloorLayer"];
    NSAssert(floorLayer != nil, @"Ground layer not found!");
    CGPoint tilePositionOnMap = [floorLayer positionAt:tilePos];
    CCSprite *addedTile = [[CCSprite alloc] initWithFile:tileName];
    addedTile.anchorPoint = CGPointMake(0, 0);
    addedTile.position = tilePositionOnMap;
    addedTile.vertexZ = [self calculateVertexZ:tilePos tileMap:tileMap];
    [tileMap addChild:addedTile];
}
The floor layer is the only layer in my Tiled map, and I have added the property cc_vertexz = -1000 to this layer.
I took the calculateVertexZ method from the KnightFight project. Based on the tile coordinates on the isometric map it calculates the vertexZ, and once you look at the map it seems to make sense too.
-(float)calculateVertexZ:(CGPoint)tilePos tileMap:(CCTMXTiledMap *)tileMap
{
    float lowestZ = -(tileMap.mapSize.width + tileMap.mapSize.height);
    float currentZ = tilePos.x + tilePos.y;
    return (lowestZ + currentZ + 1);
}
Now this is the code which I'm writing in the -init of my HelloWorldLayer in the cocos2d-2 template project:
self.myTileMap = [CCTMXTiledMap tiledMapWithTMXFile:@"IsometricMap.tmx"];
[self addChild:self.myTileMap z:-100 tag:TileMapNode];
[self addTile:@"walls-02.png" AtPos:CGPointMake(0, 0) onTileMap:self.myTileMap];
[self addTile:@"walls-02.png" AtPos:CGPointMake(0, 1) onTileMap:self.myTileMap];
[self addTile:@"walls-02.png" AtPos:CGPointMake(4, 1) onTileMap:self.myTileMap];
[self addTile:@"walls-02.png" AtPos:CGPointMake(4, 0) onTileMap:self.myTileMap];
Here's the wall image -
And here's the issue -
Case 1
(0,0) should be behind (0,1) according to the calculateVertexZ method, and hence the sprite at (0,1) is rendered OVER the sprite at (0,0).
Case 2
(4,0) should be behind (4,1) according to the calculateVertexZ method. But somehow, because I'm adding the block at (4,0) AFTER (4,1), it's not giving me the desired result.
I had read that only when two sprites have the same vertexZ does the sprite added later end up on top. But here the sprites have different vertexZ values, yet the order of creation is still overriding that.
Also, I can't figure out what to do with zOrder in this equation. SOMEBODY PLS HELP
I solved this by using the vertexZ property and zOrder property of the base tile.
-(void)addTile:(NSString *)tileName AtPos:(CGPoint)tilePos onTileMap:(CCTMXTiledMap *)tileMap
{
    CCTMXLayer *floorLayer = [tileMap layerNamed:@"FloorLayer"];
    NSAssert(floorLayer != nil, @"Ground layer not found!");
    CCSprite *baseTile = [floorLayer tileAt:tilePos];
    CCSprite *addedTile = [[CCSprite alloc] initWithFile:tileName];
    addedTile.anchorPoint = CGPointMake(0, 0);
    addedTile.position = baseTile.position;
    addedTile.vertexZ = baseTile.vertexZ;
    [tileMap addChild:addedTile z:baseTile.zOrder tag:baseTile.tag];
}

MKMapRect and displaying map overlays that span 180th meridian

I am working with viewports and bounds returned from Google Geocoding API. When doing reverse geocoding for a given coordinate, the service returns several results with various granularity (country, administrative area, locality, sublocality, route, etc.). I want to select the most appropriate on the results given the current visible area on the map.
I've settled on an algorithm that compares the ratios of the areas (in MKMapPoint²) of the location viewport, the current map viewport and their intersection (using the MKMapRectIntersection function). This works very well as long as the location viewport does not span the 180th meridian. In that case their intersection is 0.
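For context, the comparison looks roughly like this (the viewportMapRect property stands for my own coordinate-to-map-rect conversion helper and is shown only for illustration):
// Score a geocoder result's viewport by how much of it overlaps the visible map.
static double MapRectArea(MKMapRect r) { return r.size.width * r.size.height; }

MKMapRect visibleRect  = mapView.visibleMapRect;
MKMapRect locationRect = result.viewportMapRect; // converted from the Google viewport
MKMapRect overlap      = MKMapRectIntersection(locationRect, visibleRect);
double coverage = MapRectArea(overlap) / MapRectArea(locationRect); // larger = better match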
I've started to investigate the cause, and as a debugging aid I display MKPolygon overlays on the map to give me visual clues as to what is going on. To avoid possible errors introduced by my code that converts between geo-coordinates and MKMapRect, I have constructed the polygon overlay using the original coordinates from the Google results, like this:
CLLocationCoordinate2D sw, ne, nw, se;
sw = location.viewportSouthWest.coordinate;
ne = location.viewportNorthEast.coordinate;
nw = CLLocationCoordinate2DMake(ne.latitude, sw.longitude);
se = CLLocationCoordinate2DMake(sw.latitude, ne.longitude);
CLLocationCoordinate2D coords[] = {nw, ne, se, sw};
MKPolygon *p = [MKPolygon polygonWithCoordinates:coords count:4];
For an example of a problematic location, here is the viewport returned for the United States, the last result of type country, when geocoding coordinates somewhere in Virginia:
Southwest: 18.9110643, 172.4546967
Northeast: 71.3898880, -66.9453948
Notice how the southwest coordinate, which is the lower-left corner of the location viewport, lies across the 180th meridian. When this location is overlaid as a polygon on the map, it displays incorrectly to the right of the USA borders (big brown rectangle, only the lower-left corner visible):
Similarly, displaying the location viewport for Russia shows the rectangle positioned incorrectly, to the left of the border of Russia.
This visually confirms that a similar problem occurs when I convert the location viewport to MKMapPoints and an MKMapRect and find no intersection between the map viewport (white rectangle in the picture above) and the location viewport.
The way I compute the map rect is similar to answers in this SO question:
How to fit a certain bounds consisting of NE and SW coordinates into the visible map view?
...which works fine unless the coordinates span the 180th meridian. Testing the MKMapRect with MKMapRectSpans180thMeridian returns false, so that construction method is incorrect.
Apple documentation is not helpful in this regard. The only hint I've found is in MKOverlay.h:
// boundingMapRect should be the smallest rectangle that completely contains
// the overlay.
// For overlays that span the 180th meridian, boundingMapRect should have
// either a negative MinX or a MaxX that is greater than MKMapSizeWorld.width.
@property (nonatomic, readonly) MKMapRect boundingMapRect;
What is the correct way to display the polygon overlay that span the 180th meridian?
How to correctly construct MKMapRect that spans 180th meridian?
As this area is woefully under-documented, the Map Kit Functions Reference should be amended with:
Warning: All the described functions work fine, as long as you do not cross the 180th meridian.
Here be dragons. You have been warned...
To answer this, I resorted to good old investigative testing. Please excuse the comments around the prose; they allow you to copy and paste all the source below verbatim, so that you can play with it yourself.
First, a little helper function that converts the corner points of an MKMapRect back into coordinate space, so that we can compare the results of our conversions with the starting coordinates:
NSString* MyStringCoordsFromMapRect(MKMapRect rect) {
    MKMapPoint pNE = rect.origin, pSW = rect.origin;
    pNE.x += rect.size.width;
    pSW.y += rect.size.height;
    CLLocationCoordinate2D sw, ne;
    sw = MKCoordinateForMapPoint(pSW);
    ne = MKCoordinateForMapPoint(pNE);
    return [NSString stringWithFormat:@"{{%f, %f}, {%f, %f}}",
            sw.latitude, sw.longitude, ne.latitude, ne.longitude];
}
/*
And now, let's test
How To Create MapRect Spanning 180th Meridian:
*/
- (void)testHowToCreateMapRectSpanning180thMeridian
{
/*
We'll use the location viewport of Asia, as returned by the Google Geocoding API, because it spans the antimeridian. The northeast corner already lies in the western hemisphere, in the longitudinal range (-180, 0):
*/
CLLocationCoordinate2D sw, ne, nw, se;
sw = CLLocationCoordinate2DMake(-12.9403000, 25.0159000);
ne = CLLocationCoordinate2DMake(81.6691780, -168.3545000);
nw = CLLocationCoordinate2DMake(ne.latitude, sw.longitude);
se = CLLocationCoordinate2DMake(sw.latitude, ne.longitude);
/*
For reference, here are the bounds of the whole projected world, some 268 million, after converting to MKMapPoints. Our little helper function shows us that the Mercator projection used here is unable to express latitudes above ±85 degrees. Longitude spans nicely from -180 to 180 degrees.
*/
NSLog(@"\nMKMapRectWorld: %@\n => %@",
MKStringFromMapRect(MKMapRectWorld),
MyStringCoordsFromMapRect(MKMapRectWorld));
// MKMapRectWorld: {{0.0, 0.0}, {268435456.0, 268435456.0}}
// => {{-85.051129, -180.000000}, {85.051129, 180.000000}}
/*
Why was the MKPolygon overlay, created using the geo-coordinates, displayed in the wrong place on the map?
*/
// MKPolygon bounds
CLLocationCoordinate2D coords[] = {nw, ne, se, sw};
MKPolygon *p = [MKPolygon polygonWithCoordinates:coords count:4];
MKMapRect rp = p.boundingMapRect;
STAssertFalse(MKMapRectSpans180thMeridian(rp), nil); // Incorrect!!!
NSLog(@"\n rp: %@\n => %@",
MKStringFromMapRect(rp),
MyStringCoordsFromMapRect(rp));
// rp: {{8683514.2, 22298949.6}, {144187420.8, 121650857.5}}
// => {{-12.940300, -168.354500}, {81.669178, 25.015900}}
/*
It looks like the longitudes got swapped the wrong way. Asia is {{-12, 25}, {81, -168}}. The resulting MKMapRect does not pass the test using the MKMapRectSpans180thMeridian function, and we know it should!
False Attempts
So MKPolygon does not compute the MKMapRect correctly when the coordinates span the antimeridian. OK, let's create the map rect ourselves. Here are two methods suggested in answers to How to fit a certain bounds consisting of NE and SW coordinates into the visible map view?
... a quick way is a slight trick using the MKMapRectUnion function. Create a zero-size MKMapRect from each coordinate and then merge the two rects into one big rect using the function:
*/
// https://stackoverflow.com/a/8496988/41307
MKMapPoint pNE = MKMapPointForCoordinate(ne);
MKMapPoint pSW = MKMapPointForCoordinate(sw);
MKMapRect ru = MKMapRectUnion(MKMapRectMake(pNE.x, pNE.y, 0, 0),
MKMapRectMake(pSW.x, pSW.y, 0, 0));
STAssertFalse(MKMapRectSpans180thMeridian(ru), nil); // Incorrect!!!
STAssertEquals(ru, rp, nil);
NSLog(@"\n ru: %@\n => %@",
MKStringFromMapRect(ru),
MyStringCoordsFromMapRect(ru));
// ru: {{8683514.2, 22298949.6}, {144187420.8, 121650857.5}}
// => {{-12.940300, -168.354500}, {81.669178, 25.015900}}
/*
Curiously, we have the same result as before. It makes sense that MKPolygon probably computes its bounds using MKMapRectUnion anyway.
Now I've done the next one myself, too: compute the map rect's origin, width and height manually, while trying to be fancy and not worry about the correct ordering of the corners.
*/
// https://stackoverflow.com/a/8500002/41307
MKMapRect ra = MKMapRectMake(MIN(pNE.x, pSW.x), MIN(pNE.y, pSW.y),
ABS(pNE.x - pSW.x), ABS(pNE.y - pSW.y));
STAssertFalse(MKMapRectSpans180thMeridian(ra), nil); // Incorrect!!!
STAssertEquals(ra, ru, nil);
NSLog(@"\n ra: %@\n => %@",
MKStringFromMapRect(ra),
MyStringCoordsFromMapRect(ra));
// ra: {{8683514.2, 22298949.6}, {144187420.8, 121650857.5}}
// => {{-12.940300, -168.354500}, {81.669178, 25.015900}}
/*
Hey! It is the same result as before. This is how the longitudes get swapped when the coordinates cross the antimeridian. And it is probably how MKMapRectUnion works, too. Not good...
*/
// Let's put the coordinates manually in proper slots
MKMapRect rb = MKMapRectMake(pSW.x, pNE.y,
(pNE.x - pSW.x), (pSW.y - pNE.y));
STAssertFalse(MKMapRectSpans180thMeridian(rb), nil); // Incorrect!!! Still :-(
NSLog(@"\n rb: %@\n => %@",
MKStringFromMapRect(rb),
MyStringCoordsFromMapRect(rb));
// rb: {{152870935.0, 22298949.6}, {-144187420.8, 121650857.5}}
// => {{-12.940300, 25.015900}, {81.669178, -168.354500}}
/*
Remember, the Asia is {{-12, 25}, {81, -168}}. We are getting back the right coordinates, but the MKMapRect does not span the antimeridian according to MKMapRectSpans180thMeridian. What the...?!
The Solution
The hint from MKOverlay.h said:
For overlays that span the 180th meridian, boundingMapRect should have either a negative MinX or a MaxX that is greater than MKMapSizeWorld.width.
Neither of those conditions is met. What's worse, rb.size.width is negative 144 million. That's definitely wrong.
We have to correct the rect values when we pass the antimeridian, so that one of those conditions is met:
*/
// Let's correct for crossing 180th meridian
double antimeridianOverflow =
(ne.longitude > sw.longitude) ? 0 : MKMapSizeWorld.width;
MKMapRect rc = MKMapRectMake(pSW.x, pNE.y,
(pNE.x - pSW.x) + antimeridianOverflow,
(pSW.y - pNE.y));
STAssertTrue(MKMapRectSpans180thMeridian(rc), nil); // YES. FINALLY!
NSLog(@"\n rc: %@\n => %@",
MKStringFromMapRect(rc),
MyStringCoordsFromMapRect(rc));
// rc: {{152870935.0, 22298949.6}, {124248035.2, 121650857.5}}
// => {{-12.940300, 25.015900}, {81.669178, 191.645500}}
/*
Finally we have satisfied MKMapRectSpans180thMeridian. The map rect width is positive. What about the coordinates? Northeast has a longitude of 191.6455. Wrapped around the globe (-360), it is -168.3545. Q.E.D.
We have computed a correct MKMapRect that spans the 180th meridian by satisfying the second condition: MaxX (rc.origin.x + rc.size.width = 152870935.0 + 124248035.2 = 277118970.2) is greater than the width of the world (268 million).
What about satisfying the first condition, negative MinX === origin.x?
*/
// Let's correct for crossing 180th meridian another way
MKMapRect rd = MKMapRectMake(pSW.x - antimeridianOverflow, pNE.y,
(pNE.x - pSW.x) + antimeridianOverflow,
(pSW.y - pNE.y));
STAssertTrue(MKMapRectSpans180thMeridian(rd), nil); // YES. AGAIN!
NSLog(@"\n rd: %@\n => %@",
MKStringFromMapRect(rd),
MyStringCoordsFromMapRect(rd));
// rd: {{-115564521.0, 22298949.6}, {124248035.2, 121650857.5}}
// => {{-12.940300, -334.984100}, {81.669178, -168.354500}}
STAssertFalse(MKMapRectEqualToRect(rc, rd), nil);
/*
This also passes the MKMapRectSpans180thMeridian test. And the reverse conversion to geo-coordinates gives us a match, except for the southwest longitude: -334.9841. But wrapped around the world (+360), it is 25.0159. Q.E.D.
So there are two correct forms of an MKMapRect that spans the 180th meridian: one with a positive and one with a negative origin.
Alternative Method
The negative origin method demonstrated above (rd) corresponds to the result obtained by alternative method suggested by Anna Karenina in another answer to this question:
*/
// https://stackoverflow.com/a/9023921/41307
MKMapPoint points[4];
if (nw.longitude > ne.longitude) {
points[0] = MKMapPointForCoordinate(
CLLocationCoordinate2DMake(nw.latitude, -nw.longitude));
points[0].x = - points[0].x;
}
else
points[0] = MKMapPointForCoordinate(nw);
points[1] = MKMapPointForCoordinate(ne);
points[2] = MKMapPointForCoordinate(se);
points[3] = MKMapPointForCoordinate(sw);
points[3].x = points[0].x;
MKPolygon *p2 = [MKPolygon polygonWithPoints:points count:4];
MKMapRect rp2 = p2.boundingMapRect;
STAssertTrue(MKMapRectSpans180thMeridian(rp2), nil); // Also GOOD!
NSLog(@"\n rp2: %@\n => %@",
MKStringFromMapRect(rp2),
MyStringCoordsFromMapRect(rp2));
// rp2: {{-115564521.0, 22298949.6}, {124248035.2, 121650857.5}}
// => {{-12.940300, -334.984100}, {81.669178, -168.354500}}
/*
So if we manually convert to MKMapPoints and fudge the negative origin, even MKPolygon can compute the boundingMapRect correctly. The resulting map rect is equivalent to the negative-origin method above (rd).
*/
STAssertTrue([MKStringFromMapRect(rp2) isEqualToString:
MKStringFromMapRect(rd)], nil);
/*
Or should I say almost equivalent... because curiously, the following assertions would fail:
*/
// STAssertEquals(rp2, rd, nil); // Sure, shouldn't compare floats byte-wise!
// STAssertTrue(MKMapRectEqualToRect(rp2, rd), nil);
/*
One would guess they know how to compare floating point numbers, but I digress...
*/
}
This concludes the test function source code.
Displaying Overlay
As mentioned in the question, to debug the problem I used MKPolygons to visualize what was going on. It turns out that the two forms of MKMapRect that span the antimeridian are displayed differently when overlaid on the map. When you approach the antimeridian from the western hemisphere, only the one with the negative origin gets displayed. Likewise, the positive-origin form is displayed when you approach the 180th meridian from the eastern hemisphere. MKPolygonView does not handle spanning of the 180th meridian for you; you need to adjust the polygon points yourself.
This is how to create a polygon from the map rect:
- (MKPolygon *)polygonFor:(MKMapRect)r
{
    MKMapPoint p1 = r.origin, p2 = r.origin, p3 = r.origin, p4 = r.origin;
    p2.x += r.size.width;
    p3.x += r.size.width; p3.y += r.size.height;
    p4.y += r.size.height;
    MKMapPoint points[] = {p1, p2, p3, p4};
    return [MKPolygon polygonWithPoints:points count:4];
}
I have simply used brute force and added the polygon twice, once in each form:
for (GGeocodeResult *location in locations) {
    MKMapRect r = location.mapRect;
    [self.debugLocationBounds addObject:[self polygonFor:r]];
    if (MKMapRectSpans180thMeridian(r)) {
        r.origin.x -= MKMapSizeWorld.width;
        [self.debugLocationBounds addObject:[self polygonFor:r]];
    }
}
[self.mapView addOverlays:self.debugLocationBounds];
I hope this helps other souls who wander into the land of the dragons behind the 180th meridian.
According to that comment in MKOverlay.h, if the nw and sw corners were specified as negative MKMapPoint values, the overlay should be "drawn correctly".
If we try this:
// calculation of the nw, ne, se, and sw coordinates goes here
MKMapPoint points[4];
if (nw.longitude > ne.longitude) // does it cross 180th?
{
    // Get the map point for the equivalent distance on
    // the "positive" side of the dateline...
    points[0] = MKMapPointForCoordinate(
        CLLocationCoordinate2DMake(nw.latitude, -nw.longitude));
    // Reset the map point to the correct side of the dateline;
    // now it will be negative (as per Apple's comments)...
    points[0].x = -points[0].x;
}
else
{
    points[0] = MKMapPointForCoordinate(nw);
}
points[1] = MKMapPointForCoordinate(ne);
points[2] = MKMapPointForCoordinate(se);
points[3] = MKMapPointForCoordinate(sw);
points[3].x = points[0].x; // set to same as NW's, whether + or -
MKPolygon *p = [MKPolygon polygonWithPoints:points count:4];
[mapView addOverlay:p];
The resulting p.boundingMapRect does return YES for MKMapRectSpans180thMeridian (but the code already figured that out from the coordinates, since it didn't have the map rect to begin with).
Unfortunately, creating the map rect with the negative values fixes only half the problem. The half of the polygon that is east of the dateline is now drawn correctly, but the other half, west of the dateline, does not get drawn at all.
Apparently, the built-in MKPolygonView does not call MKMapRectSpans180thMeridian and draw the polygon in two parts.
You can create a custom overlay view and do this drawing yourself (you'd create one overlay but the view would draw two polygons).
Or, you could just create two MKPolygon overlays and let the map view draw them by adding the following after the above code:
if (MKMapRectSpans180thMeridian(p.boundingMapRect))
{
    MKMapRect remainderRect = MKMapRectRemainder(p.boundingMapRect);
    MKMapPoint remPoints[4];
    remPoints[0] = remainderRect.origin;
    remPoints[1] = MKMapPointMake(remainderRect.origin.x + remainderRect.size.width, remainderRect.origin.y);
    remPoints[2] = MKMapPointMake(remainderRect.origin.x + remainderRect.size.width, remainderRect.origin.y + remainderRect.size.height);
    remPoints[3] = MKMapPointMake(remainderRect.origin.x, remainderRect.origin.y + remainderRect.size.height);
    MKPolygon *remPoly = [MKPolygon polygonWithPoints:remPoints count:4];
    [mapView addOverlay:remPoly];
}
By the way, there is a similar issue with drawing MKPolyline overlays that cross +/-180 (see this question).
In short, if the polygon crosses the antimeridian, check the mapPoints.
If the mapPoint.x is greater than the primeMeridian.x, subtract the width of the world from the mapPoint.x.
This splits the map down the prime meridian. The minX is negative and the map size is smaller than the width of the world. Palimondo's answer was hugely helpful in figuring this out.
I am working with geodesic polylines and spent a couple of days on this. Finally found an answer!
/// Note: If both the prime meridian and the antimeridian are crossed, an empty polygon will be returned
func makePolygon(lines: [LineAnnotation]) -> MKPolygon {
    let polylines = lines.map({ $0.polyline })
    let sum = polylines.reduce(0, { $0 + $1.pointCount })
    let pointer = UnsafeMutablePointer<MKMapPoint>.allocate(capacity: sum)
    var advance = 0
    let spans180thMeridian = polylines.contains(where: { $0.boundingMapRect.spans180thMeridian })
    let primeMeridianMapPoint = MKMapPoint(CLLocationCoordinate2D(latitude: 0, longitude: 0))
    let spansPrimeMeridian = polylines.contains(where: {
        return $0.boundingMapRect.minX <= primeMeridianMapPoint.x && $0.boundingMapRect.maxX >= primeMeridianMapPoint.x
    })
    guard !(spans180thMeridian && spansPrimeMeridian) else { return MKPolygon() }

    if spans180thMeridian {
        for polyline in polylines {
            // initialize the pointer with a copy of the polyline points, adjusted if needed
            let points = UnsafeMutablePointer<MKMapPoint>.allocate(capacity: polyline.pointCount)
            points.initialize(from: polyline.points(), count: polyline.pointCount)
            for i in 0..<polyline.pointCount {
                let pointPointer = points.advanced(by: i)
                if pointPointer.pointee.x > primeMeridianMapPoint.x {
                    pointPointer.pointee.x -= MKMapSize.world.width
                }
            }
            pointer.advanced(by: advance).initialize(from: points, count: polyline.pointCount)
            advance += polyline.pointCount
            points.deinitialize(count: polyline.pointCount)
            points.deallocate()
        }
    } else {
        // initialize the pointer with the polyline points
        for polyline in polylines {
            pointer.advanced(by: advance).initialize(from: polyline.points(), count: polyline.pointCount)
            advance += polyline.pointCount
        }
    }

    let polygon = MKPolygon(points: pointer, count: sum)
    print(polygon.boundingMapRect)
    pointer.deinitialize(count: sum)
    pointer.deallocate()
    return polygon
}

Hit detection when drawing lines in iOS

I would like to allow the user to draw curves in such a way that no line can cross another line or even itself. Drawing the curves is no problem, and I even found that I can create a path that is closed and still pretty line-like by tracing the nodes of the line forwards and back and then closing the path.
Unfortunately, iOS only provides a test for whether a point is contained in a closed path (containsPoint: and CGPathContainsPoint), and a user can pretty easily move their finger fast enough that consecutive touch points land on opposite sides of an existing path without actually being contained by it, so testing the touch points is pretty pointless.
I can't find any "intersection" of paths method.
Any other thoughts on how to accomplish this task?
Well, I did come up with a way to do this. It is imperfect, but I thought others might want to see the technique since this question was upvoted a few times. The technique draws all the items to be tested against into one bitmap context, draws the new segment of the progressing line into another bitmap context, and then compares the data in those contexts using bitwise operators; if any overlap is found, a hit is declared.
The idea behind this technique is to test each segment of a newly drawn line against all the previously drawn lines and even against earlier pieces of the same line. In other words, it detects when a line crosses another line and also when it crosses over itself.
A sample app demonstrating the technique is available: LineSample.zip.
The core of hit testing is done in my LineView object. Here are two key methods:
- (CGContextRef)newBitmapContext {
    // creating b&w bitmaps to do hit testing
    // based on: http://robnapier.net/blog/clipping-cgrect-cgpath-531
    // see "Supported Pixel Formats" in the Quartz 2D Programming Guide
    CGContextRef bitmapContext =
        CGBitmapContextCreate(NULL, // data automatically allocated
                              self.bounds.size.width,
                              self.bounds.size.height,
                              8,
                              self.bounds.size.width,
                              NULL,
                              kCGImageAlphaOnly);
    CGContextSetShouldAntialias(bitmapContext, NO);
    // use CGBitmapContextGetData to get at this data
    return bitmapContext;
}
- (BOOL)line:(Line *)line canExtendToPoint:(CGPoint)newPoint {
    // Lines are made up of segments that go from node to node. If we want to test for
    // self-crossing, then we can't just test the whole in-progress line against the
    // completed line; we actually have to test each segment, since one segment of the
    // in-progress line may cross another segment of the same line (think of a loop in
    // the line). We also have to avoid checking the first point of the new segment
    // against the last point of the previous segment (which is the same point).
    // Luckily, a line cannot curve back on itself in just one segment (think about it,
    // it takes at least two segments to reach yourself again). This means that we can
    // both test progressive segments and avoid false hits by NOT drawing the last
    // segment of the line into the test! So we will put everything up to the last
    // segment into the hitProgressLayer, we will put the new segment into the
    // segmentLayer, and then we will test for overlap among those two and the
    // hitTestLayer. Any point that is in all three layers will indicate a hit,
    // otherwise we are OK.
    if (line.failed) {
        // shortcut in case a failed line is retested
        return NO;
    }
    BOOL ok = YES; // thinking positively
    // set up a context to hold the new segment and stroke it in
    CGContextRef segmentContext = [self newBitmapContext];
    CGContextSetLineWidth(segmentContext, 2); // bit thicker to facilitate hits
    CGPoint lastPoint = [[[line nodes] lastObject] point];
    CGContextMoveToPoint(segmentContext, lastPoint.x, lastPoint.y);
    CGContextAddLineToPoint(segmentContext, newPoint.x, newPoint.y);
    CGContextStrokePath(segmentContext);
    // now we actually test
    // based on code from benzado: http://stackoverflow.com/questions/6515885/how-to-do-comparisons-of-bitmaps-in-ios/6515999#6515999
    unsigned char *completedData = CGBitmapContextGetData(hitCompletedContext);
    unsigned char *progressData = CGBitmapContextGetData(hitProgressContext);
    unsigned char *segmentData = CGBitmapContextGetData(segmentContext);
    size_t bytesPerRow = CGBitmapContextGetBytesPerRow(segmentContext);
    size_t height = CGBitmapContextGetHeight(segmentContext);
    size_t len = bytesPerRow * height;
    for (int i = 0; i < len; i++) {
        if ((completedData[i] | progressData[i]) & segmentData[i]) {
            ok = NO;
            break;
        }
    }
    CGContextRelease(segmentContext);
    if (ok) {
        // now that we know we are good to go,
        // we will add the last segment onto the hitProgressLayer
        int numberOfSegments = [[line nodes] count] - 1;
        if (numberOfSegments > 0) {
            // but only if there is a segment there!
            CGPoint secondToLastPoint = [[[line nodes] objectAtIndex:numberOfSegments-1] point];
            CGContextSetLineWidth(hitProgressContext, 1); // but thinner
            CGContextMoveToPoint(hitProgressContext, secondToLastPoint.x, secondToLastPoint.y);
            CGContextAddLineToPoint(hitProgressContext, lastPoint.x, lastPoint.y);
            CGContextStrokePath(hitProgressContext);
        }
    } else {
        line.failed = YES;
        [linesFailed addObject:line];
    }
    return ok;
}
I'd love to hear suggestions or see improvements. For one thing, it would be a lot faster to check only the bounding rect of the new segment instead of the whole view.
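A rough, untested sketch of that optimisation (using the same variables as the method above): restrict the comparison to the rows and columns covered by the new segment instead of scanning every byte with the for (int i = 0; i < len; i++) loop.
// Only scan the bytes inside the segment's (slightly outset) bounding box.
CGRect dirty = CGRectUnion(CGRectMake(lastPoint.x, lastPoint.y, 1, 1),
                           CGRectMake(newPoint.x, newPoint.y, 1, 1));
dirty = CGRectIntegral(CGRectInset(dirty, -2, -2)); // allow for the stroke width
size_t minRow = (size_t)MAX(CGRectGetMinY(dirty), (CGFloat)0);
size_t maxRow = (size_t)MIN(CGRectGetMaxY(dirty), (CGFloat)height);
size_t minCol = (size_t)MAX(CGRectGetMinX(dirty), (CGFloat)0);
size_t maxCol = (size_t)MIN(CGRectGetMaxX(dirty), (CGFloat)bytesPerRow);
for (size_t row = minRow; row < maxRow && ok; row++) {
    for (size_t col = minCol; col < maxCol; col++) {
        size_t i = row * bytesPerRow + col;
        if ((completedData[i] | progressData[i]) & segmentData[i]) {
            ok = NO;
            break;
        }
    }
}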
Swift 4; this answer is based on CGPath Hit Testing (Ole Begemann, 2012).
From Ole Begemann's blog:
contains(point: CGPoint)
This function is helpful if you want to hit test on the entire region
the path covers. As such, contains(point: CGPoint) doesn’t work with
unclosed paths because those don’t have an interior that would be
filled.
copy(strokingWithWidth lineWidth: CGFloat, lineCap: CGLineCap, lineJoin: CGLineJoin, miterLimit: CGFloat, transform: CGAffineTransform = default) -> CGPath
This function creates a mirroring tap target object that only covers
the stroked area of the path. When the user taps on the screen, we
iterate over the tap targets rather than the actual shapes.
My solution in code
I use a UITapGestureRecognizer linked to the function tap():
var bezierPaths = [UIBezierPath]() // containing all lines already drawn
var tappedPaths = [UIBezierPath]()

@IBAction func tap(_ sender: UITapGestureRecognizer) {
    let point = sender.location(in: imageView)
    for path in bezierPaths {
        // create a tap target for the path and test the tapped point against it
        let target = tapTarget(for: path)
        if target.contains(point) {
            tappedPaths.append(path)
        }
    }
}

fileprivate func tapTarget(for path: UIBezierPath) -> UIBezierPath {
    let targetPath = path.cgPath.copy(strokingWithWidth: path.lineWidth,
                                      lineCap: path.lineCapStyle,
                                      lineJoin: path.lineJoinStyle,
                                      miterLimit: path.miterLimit)
    return UIBezierPath(cgPath: targetPath)
}
