Why is my programmatic screenshot capturing an out-of-date view? - ios

I have a map view that allows users to draw perimeter lines, and I need to capture a screenshot when they are done to save for recording purposes. For some reason, my code is capturing the view state before the new overlay is added, even though I add the overlay before attempting the screenshot.
I use a separate view to capture gestures, then convert the points to an overlay and add it to the map here, where points is the array of points gathered from the gesture tracking:
func convertFragments() {
    var coordinates: [CLLocationCoordinate2D] = []
    for point in points {
        let coordinate = mapView.convert(point, toCoordinateFrom: drawingView)
        coordinates.append(coordinate)
    }
    let polyline = MKPolyline(coordinates: coordinates, count: coordinates.count)
    polyline.title = selectedTool.name
    removeLines(for: selectedTool)
    points = []
    mapView.addOverlay(polyline)
    incidentManager.update(for: selectedTool, value: polyline)
}
Then incidentManager.update(for:value:) makes a network call to save the information and calls a second method, log(eventType:), to capture the screenshot and update a log:
func update(for tool: MapTool, value: Any) {
    func saveLine(_ line: MKPolyline, for key: String) {
        incidentReference.setData([
            key: IncidentManager.convertCLPoints(line.coordinates)
        ], merge: true)
    }
    switch tool {
    case .hotZone, .innerPerimeter, .outerPerimeter:
        let line = value as! MKPolyline
        saveLine(line, for: tool.rawValue)
    case .commandPost, .stagingArea:
        let point = value as! CLLocationCoordinate2D
        let geoPoint = GeoPoint(latitude: point.latitude, longitude: point.longitude)
        incidentReference.setData([
            tool.rawValue: geoPoint
        ], merge: true)
    case .poi:
        let annotation = value as! PerimeterMapAnnotation
        let point = annotation.coordinate
        let geoPoint = GeoPoint(latitude: point.latitude, longitude: point.longitude)
        incidentReference.setData([
            "pointsOfInterest": [annotation.title: geoPoint]
        ], merge: true)
    default:
        return
    }
    updateAddress(defaultValue: value)
    let eventType = tool.eventType(didSet: true)
    log(eventType: eventType)
}
I have an extension on UIView that captures the view, but for some reason it's not the most up-to-date version of the map view.
private func log(eventType: LogEventType) {
    guard let mapImage = mapView.asImage() else { return }
    let storageRef = Storage.storage().reference()
    let imageRef = storageRef.child("\(incident.id)/\(UUID().uuidString).jpg")
    let eventTime = Date()
    let eventTimeString = eventTime.apiDateString
    incidentReference.collection("eventLog").document(eventTimeString).setData([
        "logEventType": eventType.rawValue,
        "imageReference": imageRef.fullPath,
        "date": Timestamp(date: eventTime)
    ])
    upload(image: mapImage, at: imageRef)
    if shouldUpdateCover {
        shouldUpdateCover = false
        incidentReference.setData([
            "coverPhotoRef": imageRef.fullPath
        ], merge: true)
    }
}
func asImage() -> UIImage? {
    UIGraphicsBeginImageContextWithOptions(self.bounds.size, self.isOpaque, 0.0)
    defer { UIGraphicsEndImageContext() }
    if let context = UIGraphicsGetCurrentContext() {
        self.layer.render(in: context)
        let image = UIGraphicsGetImageFromCurrentImageContext()
        return image
    }
    return nil
}
I have tried many variations of this "screenshot" method that I have found online, with no luck. When I debug, I can inspect the map and its overlays and see that they are updated beforehand.
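For example, one such variation swaps layer.render(in:) for drawHierarchy(in:afterScreenUpdates:), which asks UIKit to apply pending screen updates before drawing (a sketch only; the method name asImageAfterUpdates is just illustrative, not code from my project):

// Sketch of an alternative capture helper using drawHierarchy(in:afterScreenUpdates:),
// so pending screen updates are applied before the view is rendered into the context.
extension UIView {
    func asImageAfterUpdates() -> UIImage? {
        UIGraphicsBeginImageContextWithOptions(bounds.size, isOpaque, 0.0)
        defer { UIGraphicsEndImageContext() }
        drawHierarchy(in: bounds, afterScreenUpdates: true)
        return UIGraphicsGetImageFromCurrentImageContext()
    }
}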
Any ideas why the image captured here does not capture the changes made?

Related

Mapbox Navigation in iOS within my mapView controller

I want to integrate Mapbox navigation in iOS. I can easily get the direction/route between two coordinates, and to get the navigation path from Mapbox we can use the code below:
let options = NavigationOptions(styles: nil)
let viewController = NavigationViewController(for: self.directionsRoute!)
viewController.delegate = self
self.present(viewController, animated: true, completion: nil)
But the problem is that I want to display the navigation in my map view, which is part of another view controller. I can do that by getting a direction/route and instructions, but I can't find any method that is called every second so that I can update the route instruction, as well as the route, in case the user changes the path.
Let me know if I am missing anything or if any changes are needed. Thanks in advance.
Here is my approach:
First, I get only the directions instructions from the Mapbox API, taking advantage of its free API call quota, and draw the instructions on GMSMapView or MapKit, taking advantage of their good performance and memory management.
podfile
pod 'MapboxDirections.swift'
import MapboxDirections
This is done through the code below.
Hold the properties for the Mapbox directions:
@IBOutlet weak var googleMapView: GMSMapView!
let locationManager = CLLocationManager()
let mapBoxirections = Directions(accessToken: osmToken)
var path: GMSMutablePath?
Then do the actual API call:
private func drawRouteBetween(source: StopModel, destination: StopModel) {
    guard let name = source.name, let lat = source.latitude, let lng = source.longitude else { return }
    guard let nameDest = destination.name, let latDest = destination.latitude, let lngDest = destination.longitude else { return }
    let waypoints = [
        Waypoint(coordinate: CLLocationCoordinate2D(latitude: lat, longitude: lng), name: name),
        Waypoint(coordinate: CLLocationCoordinate2D(latitude: latDest, longitude: lngDest), name: nameDest),
    ]
    let options = RouteOptions(waypoints: waypoints, profileIdentifier: .automobile)
    options.includesSteps = true
    options.distanceMeasurementSystem = .metric
    mapBoxirections.calculate(options) { (waypoints, routes, error) in
        guard error == nil else {
            print("Error calculating directions: \(error!)")
            return
        }
        if let route = routes?.first, let leg = route.legs.first {
            for step in leg.steps {
                if let coordinates = step.coordinates {
                    for (index, point) in coordinates.enumerated() {
                        let source = point
                        if index <= coordinates.count - 2 {
                            let destination = coordinates[index + 1]
                            self.drawPolyLine(source: source, destination: destination)
                        }
                    }
                }
            }
        }
    }
}
Note that StopModel is my custom-made model built around CLLocation, so feel free to replace it with your own as long as it has the latitude and longitude.
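For reference, a minimal stand-in for StopModel that satisfies what drawRouteBetween(source:destination:) reads from it could look like this (the optional property types are assumptions based on the guard-let unwrapping above):

import CoreLocation

// Hypothetical stand-in for the custom StopModel used above; the optional
// properties mirror the unwrapping in drawRouteBetween(source:destination:).
struct StopModel {
    var name: String?
    var latitude: CLLocationDegrees?
    var longitude: CLLocationDegrees?
}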
Create the method that draws the polyline in your CLLocationManagerDelegate class, as below:
private func drawPolyLine(source: CLLocationCoordinate2D, destination: CLLocationCoordinate2D) {
    path?.add(source)
    path?.add(destination)
    let polyLine = GMSPolyline(path: path)
    polyLine.strokeWidth = 4 // width of your choice
    polyLine.strokeColor = .red // color of your choice
    polyLine.map = googleMapView
}
Then take a look at the MapboxDirections.Route model and explore its properties; you will find very useful info inside it.
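For instance, a few properties that are handy for building instructions (a sketch that assumes it runs inside the calculate(_:) completion above, where routes is available; verify the property names against the MapboxDirections.swift version you have installed):

// Sketch: pulling summary info and per-step instructions out of a calculated Route.
if let route = routes?.first {
    print("Distance: \(route.distance) m, expected travel time: \(route.expectedTravelTime) s")
    for leg in route.legs {
        for step in leg.steps {
            print("\(step.instructions) (\(step.distance) m)")
        }
    }
}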
Next, take advantage of the callback function from the location manager delegate that notifies you of location updates, instead of having a timer and calling it every second; this is a more efficient way:
func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
/* do your business here */
}
Do not forget to set the location manager's delegate to self (or the class of your choice).
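A minimal wiring sketch (the authorization call assumes when-in-use permission is enough for your app; adjust as needed):

// Somewhere during setup, e.g. viewDidLoad:
locationManager.delegate = self                  // so didUpdateLocations is delivered here
locationManager.requestWhenInUseAuthorization()  // assumes when-in-use permission suffices
locationManager.desiredAccuracy = kCLLocationAccuracyBest
locationManager.startUpdatingLocation()          // triggers the delegate callback above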
Maybe this helps a bit: you can easily add an observer for route progress changes:
NotificationCenter.default.addObserver(self,
                                       selector: #selector(progressDidChange(notification:)),
                                       name: .routeControllerProgressDidChange,
                                       object: navigationService.router)
You need a navigation service with your route, created like this:
let navigationService = MapboxNavigationService(route: route)
The function progressDidChange can do something like:
@objc func progressDidChange(notification: NSNotification) {
    guard let routeProgress = notification.userInfo?[RouteControllerNotificationUserInfoKey.routeProgressKey] as? RouteProgress,
          let location = notification.userInfo?[RouteControllerNotificationUserInfoKey.locationKey] as? CLLocation else {
        return
    }
    // you have all the information you probably need in routeProgress, e.g.
    let secondsRemaining = routeProgress.currentLegProgress.currentStepProgress.durationRemaining
    ...
}
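As a possible use of those values (purely illustrative; etaLabel is an assumed outlet, not part of the original answer):

// Illustrative only: update some UI with the remaining time on the current step.
etaLabel.text = "Arriving in \(Int(secondsRemaining / 60)) min"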

How to draw CLLocationCoordinate2Ds on MKMapSnapshotter (drawing on a mapView printed image)

I have a mapView with an array of CLLocationCoordinate2D. I use these locations to draw lines on my mapView using MKPolyline. Now I want to store it as a UIImage. I found that there's the MKMapSnapshotter class, but unfortunately I can't draw overlays on it: "Snapshotter objects do not capture the visual representations of any overlays or annotations that your app creates." So I get only a blank map image. Is there any way to get an image with my overlays?
private func generateImageFromMap() {
    let mapSnapshotterOptions = MKMapSnapshotter.Options()
    guard let region = mapRegion() else { return }
    mapSnapshotterOptions.region = region
    mapSnapshotterOptions.size = CGSize(width: 200, height: 200)
    mapSnapshotterOptions.showsBuildings = false
    mapSnapshotterOptions.showsPointsOfInterest = false
    let snapShotter = MKMapSnapshotter(options: mapSnapshotterOptions)
    snapShotter.start() { snapshot, error in
        guard let snapshot = snapshot else { return }
        // do something with the image ...
        let mapImage = snapshot.image
    }
}
How can I put overlays on this image? Or is there another way to solve this problem?
Unfortunately, you have to draw them yourself. Fortunately, the snapshot (MKMapSnapshotter.Snapshot) has a convenient point(for:) method to convert a CLLocationCoordinate2D into a CGPoint within the snapshot.
For example, assume you had an array of CLLocationCoordinate2D:
private var coordinates: [CLLocationCoordinate2D]?
private func generateImageFromMap() {
    guard let region = mapRegion() else { return }
    let options = MKMapSnapshotter.Options()
    options.region = region
    options.size = CGSize(width: 200, height: 200)
    options.showsBuildings = false
    options.showsPointsOfInterest = false
    MKMapSnapshotter(options: options).start() { snapshot, error in
        guard let snapshot = snapshot else { return }
        let mapImage = snapshot.image
        let finalImage = UIGraphicsImageRenderer(size: mapImage.size).image { _ in
            // draw the map image
            mapImage.draw(at: .zero)
            // only bother with the following if we have a path with two or more coordinates
            guard let coordinates = self.coordinates, coordinates.count > 1 else { return }
            // convert the `[CLLocationCoordinate2D]` into a `[CGPoint]`
            let points = coordinates.map { coordinate in
                snapshot.point(for: coordinate)
            }
            // build a bezier path using that `[CGPoint]`
            let path = UIBezierPath()
            path.move(to: points[0])
            for point in points.dropFirst() {
                path.addLine(to: point)
            }
            // stroke it
            path.lineWidth = 1
            UIColor.blue.setStroke()
            path.stroke()
        }
        // do something with finalImage
    }
}
Given a map view where those coordinates are rendered as an MKPolyline by mapView(_:rendererFor:), as usual, the above code will create a matching finalImage.
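For completeness, that delegate method might look roughly like this (a sketch; the blue stroke just matches the line drawn into the snapshot above):

// Sketch of the usual MKMapViewDelegate renderer for the on-screen polyline.
func mapView(_ mapView: MKMapView, rendererFor overlay: MKOverlay) -> MKOverlayRenderer {
    if let polyline = overlay as? MKPolyline {
        let renderer = MKPolylineRenderer(polyline: polyline)
        renderer.strokeColor = .blue
        renderer.lineWidth = 1
        return renderer
    }
    return MKOverlayRenderer(overlay: overlay)
}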

I get an empty CLLocationCoordinates array when loading data from user defaults

I'm trying to store in UserDefaults an array of CLLocationCoordinate2Ds from the tracking portion of my app, paired with the name of the tracked route as the key, so I can recall it later and use it within a function.
The problem is that when I call that function I get an index out of range error. I checked, and the array is empty.
As I'm new to user defaults, I looked at other similar posts, but they're all about NSUserDefaults and I didn't find a solution.
Here's the code for the functions that store and recall the array:
func stopTracking2() {
    self.trackingIsActive = false
    self.trackigButton.backgroundColor = UIColor.yellow
    locationManager.stopUpdatingLocation()
    let stopRoutePosition = RouteAnnotation(title: "Route Stop", coordinate: (locationManager.location?.coordinate)!, imageName: "Route Stop")
    self.actualRouteInUseAnnotations.append(stopRoutePosition)
    print(actualRouteInUseCoordinatesArray)
    print(actualRouteInUseAnnotations)
    drawRoutePolyline() // draw line to show route
    // checkAlerts2() // check if there is any notified problem on our route and mark it with a blue circle; now called at programmed checking
    saveRouteToUserDefaults()
    postRouteToAnalitics() // store route anonymously to Firebase
}
func saveRouteToUserDefaults() {
    // save actualRouteInUseCoordinatesArray : change for function
    // userDefaults.set(actualRouteInUseCoordinatesArray, forKey: "\(String(describing: userRoute))")
    storeCoordinates(actualRouteInUseCoordinatesArray)
}
// Store an array of CLLocationCoordinate2D
func storeCoordinates(_ coordinates: [CLLocationCoordinate2D]) {
    let locations = coordinates.map { coordinate -> CLLocation in
        return CLLocation(latitude: coordinate.latitude, longitude: coordinate.longitude)
    }
    let archived = NSKeyedArchiver.archivedData(withRootObject: locations)
    userDefaults.set(archived, forKey: "\(String(describing: userRoute))")
    userDefaults.synchronize()
}
func loadRouteFromUserDefaults() {
    // gets entry from userRouteArray stored in userDefaults and append them into actualRouteInUseCoordinatesArray
    actualRouteInUseCoordinatesArray.removeAll()
    actualRouteInUseCoordinatesArray = userDefaults.object(forKey: "\(String(describing: userRoute))") as? [CLLocationCoordinate2D] ?? [CLLocationCoordinate2D]() // here we get the right set of coordinates for the route we are about to do the check on
    // load route coordinates from UserDefaults
    // actualRouteInUseCoordinatesArray = loadCoordinates()! // error: found nil
}
// Return an array of CLLocationCoordinate2D
func loadCoordinates() -> [CLLocationCoordinate2D]? {
    guard let archived = userDefaults.object(forKey: "\(String(describing: userRoute))") as? Data,
          let locations = NSKeyedUnarchiver.unarchiveObject(with: archived) as? [CLLocation] else {
        return nil
    }
    let coordinates = locations.map { location -> CLLocationCoordinate2D in
        return location.coordinate
    }
    return coordinates
}
}
extension NewMapViewController {
    // ALERTS:
    func checkAlerts2() {
        loadRouteFromUserDefaults() // load route coordinates to check in
        // CHECK IF ANY OBSTACLE IS ON OUR ROUTE BY COMPARING DISTANCES
        while trackingCoordinatesArrayPosition != ((actualRouteInUseCoordinatesArray.count) - 1) {
            print("checking is started")
            print(actualRouteInUseCoordinatesArray)
            let trackingLatitude = actualRouteInUseCoordinatesArray[trackingCoordinatesArrayPosition].latitude
            let trackingLongitude = actualRouteInUseCoordinatesArray[trackingCoordinatesArrayPosition].longitude
            let alertLatitude = alertNotificationCoordinatesArray[alertNotificationCoordinatesArrayPosition].latitude
            let alertLongitude = alertNotificationCoordinatesArray[alertNotificationCoordinatesArrayPosition].longitude
            let coordinateFrom = CLLocation(latitude: trackingLatitude, longitude: trackingLongitude)
            let coordinateTo = CLLocation(latitude: alertLatitude, longitude: alertLongitude)
            let coordinatesDistanceInMeters = coordinateFrom.distance(from: coordinateTo)
            // CHECK SENSITIVITY: sets the distance in meters for an alert to be considered an obstacle
            if coordinatesDistanceInMeters <= 10 {
                print("found problem")
                routeObstacle.append(alertNotificationCoordinatesArray[alertNotificationCoordinatesArrayPosition]) // populate obstacles array
                trackingCoordinatesArrayPosition = (trackingCoordinatesArrayPosition + 1)
            }
            else if alertNotificationCoordinatesArrayPosition < ((alertNotificationCoordinatesArray.count) - 1) {
                alertNotificationCoordinatesArrayPosition = alertNotificationCoordinatesArrayPosition + 1
            }
            else if alertNotificationCoordinatesArrayPosition == (alertNotificationCoordinatesArray.count - 1) {
                trackingCoordinatesArrayPosition = (trackingCoordinatesArrayPosition + 1)
                alertNotificationCoordinatesArrayPosition = 0
            }
        }
        findObstacles()
        NewMapViewController.checkCounter = 0
        displayObstacles()
    }
In the extension you can see the function that uses the array.
Right after the print of the array I get the index out of range error.
Thanks as usual to the community.
After trying various solutions offered I decided to rewrite the whole thing.
So after finding a post on how to encode/decode my array to a string, I decided it was the way to go. It shouldn't be heavy on the system, as it's a string that gets saved. Please let me know what you think of this solution.
Thanks to @Sh_Khan for pointing out it was a decoding issue, and to @Moritz for pointing out I was following a bad practice.
So the code is:
func storeRoute() {
    // first we encode the CLLocationCoordinate2D array to a string
    // second we store the string into userDefaults
    userDefaults.set(encodeCoordinates(coords: actualRouteInUseCoordinatesArray), forKey: "\(String(describing: NewMapViewController.userRoute))")
}
func loadRoute() {
    // first we load the string from user defaults
    let route = userDefaults.string(forKey: "\(String(describing: NewMapViewController.userRoute))")
    print("loaded route is \(route!)")
    // second we decode it into a CLLocationCoordinate2D array
    actualRouteInUseCoordinatesArray = decodeCoordinates(encodedString: route!)
    print("decoded route array is \(actualRouteInUseCoordinatesArray)")
}
func encodeCoordinates(coords: [CLLocationCoordinate2D]) -> String {
    let flattenedCoords: [String] = coords.map { coord -> String in "\(coord.latitude):\(coord.longitude)" }
    let encodedString: String = flattenedCoords.joined(separator: ",")
    return encodedString
}
func decodeCoordinates(encodedString: String) -> [CLLocationCoordinate2D] {
    let flattenedCoords: [String] = encodedString.components(separatedBy: ",")
    let coords: [CLLocationCoordinate2D] = flattenedCoords.map { coord -> CLLocationCoordinate2D in
        let split = coord.components(separatedBy: ":")
        if split.count == 2 {
            let latitude: Double = Double(split[0]) ?? 0
            let longitude: Double = Double(split[1]) ?? 0
            return CLLocationCoordinate2D(latitude: latitude, longitude: longitude)
        } else {
            return CLLocationCoordinate2D()
        }
    }
    return coords
}
Rather than using the heavyweight, Objective-C-ish NSKeyed(Un)Archiver and making a detour via CLLocation, I recommend extending CLLocationCoordinate2D to adopt Codable:
extension CLLocationCoordinate2D: Codable {
    public init(from decoder: Decoder) throws {
        var arrayContainer = try decoder.unkeyedContainer()
        if arrayContainer.count == 2 {
            let lat = try arrayContainer.decode(CLLocationDegrees.self)
            let lng = try arrayContainer.decode(CLLocationDegrees.self)
            self.init(latitude: lat, longitude: lng)
        } else {
            throw DecodingError.dataCorruptedError(in: arrayContainer, debugDescription: "Coordinate array must contain two items")
        }
    }

    public func encode(to encoder: Encoder) throws {
        var arrayContainer = encoder.unkeyedContainer()
        try arrayContainer.encode(contentsOf: [latitude, longitude])
    }
}
and replace the methods to load and save data with
func storeCoordinates(_ coordinates: [CLLocationCoordinate2D]) throws {
    let data = try JSONEncoder().encode(coordinates)
    UserDefaults.standard.set(data, forKey: String(describing: userRoute))
}

func loadCoordinates() -> [CLLocationCoordinate2D] {
    guard let data = UserDefaults.standard.data(forKey: String(describing: userRoute)) else { return [] }
    do {
        return try JSONDecoder().decode([CLLocationCoordinate2D].self, from: data)
    } catch {
        print(error)
        return []
    }
}
storeCoordinates throws, so it hands a potential encoding error over to the caller.
Load the data with
actualRouteInUseCoordinatesArray = loadCoordinates()
and save it
do {
    try storeCoordinates(actualRouteInUseCoordinatesArray)
} catch { print(error) }
Your problem is that you save it as Data but try to read it back directly without unarchiving. You can try:
let locations = [CLLocation(latitude: 123, longitude: 344), CLLocation(latitude: 123, longitude: 344), CLLocation(latitude: 123, longitude: 344)]
do {
    let archived = try NSKeyedArchiver.archivedData(withRootObject: locations, requiringSecureCoding: true)
    UserDefaults.standard.set(archived, forKey: "myKey")
    // read safely
    if let data = UserDefaults.standard.data(forKey: "myKey") {
        let saved = try NSKeyedUnarchiver.unarchiveTopLevelObjectWithData(data) as! [CLLocation]
        print(saved)
    }
}
catch {
    print(error)
}

How do you add MKPolylines to MKSnapShotter in swift 3?

Is there a way to take a screenshot of the mapView and include the polyline? I believe I need to draw CGPoints on the image that the MKMapSnapshotter returns, but I am unsure how to do so.
Current code
func takeSnapshot(mapView: MKMapView, withCallback: (UIImage?, NSError?) -> ()) {
    let options = MKMapSnapshotOptions()
    options.region = mapView.region
    options.size = mapView.frame.size
    options.scale = UIScreen.main().scale
    let snapshotter = MKMapSnapshotter(options: options)
    snapshotter.start() { snapshot, error in
        guard snapshot != nil else {
            withCallback(nil, error)
            return
        }
        if let image = snapshot?.image {
            withCallback(image, nil)
            for coordinate in self.area {
                image.draw(at: snapshot!.point(for: coordinate))
            }
        }
    }
}
I had the same problem today. After several hours of research, here is how I solved it.
The following code is in Swift 3.
1. Init your polyline coordinates array
// initialize this array with your polyline coordinates
var yourCoordinates = [CLLocationCoordinate2D]()
yourCoordinates.append( coordinate 1 )
yourCoordinates.append( coordinate 2 )
...
// you can use any data structure you like
2. Take the snapshot as usual, but set the region based on your coordinates:
func takeSnapShot() {
    let mapSnapshotOptions = MKMapSnapshotOptions()
    // Set the region of the map that is rendered. (by polyline)
    let polyLine = MKPolyline(coordinates: &yourCoordinates, count: yourCoordinates.count)
    let region = MKCoordinateRegionForMapRect(polyLine.boundingMapRect)
    mapSnapshotOptions.region = region
    // Set the scale of the image. We'll just use the scale of the current device, which is 2x scale on Retina screens.
    mapSnapshotOptions.scale = UIScreen.main.scale
    // Set the size of the image output.
    mapSnapshotOptions.size = CGSize(width: IMAGE_VIEW_WIDTH, height: IMAGE_VIEW_HEIGHT)
    // Show buildings and Points of Interest on the snapshot
    mapSnapshotOptions.showsBuildings = true
    mapSnapshotOptions.showsPointsOfInterest = true
    let snapShotter = MKMapSnapshotter(options: mapSnapshotOptions)
    snapShotter.start() { snapshot, error in
        guard let snapshot = snapshot else {
            return
        }
        // Don't just pass snapshot.image, pass snapshot itself!
        self.imageView.image = self.drawLineOnImage(snapshot: snapshot)
    }
}
3. Use snapshot.point() to draw Polylines on Snapshot Image
func drawLineOnImage(snapshot: MKMapSnapshot) -> UIImage {
    let image = snapshot.image
    // for Retina screens
    UIGraphicsBeginImageContextWithOptions(self.imageView.frame.size, true, 0)
    // draw the original image into the context
    image.draw(at: CGPoint.zero)
    // get the context for CoreGraphics
    let context = UIGraphicsGetCurrentContext()
    // set the stroking width and color of the context
    context!.setLineWidth(2.0)
    context!.setStrokeColor(UIColor.orange.cgColor)
    // Here is the trick:
    // We use move() and addLine() to draw the line, which should be easy to understand.
    // The difficult part is that they both take CGPoint as a parameter, and it would be
    // way too complex to calculate those ourselves.
    // Thus we use snapshot.point(for:) to save the pain.
    context!.move(to: snapshot.point(for: yourCoordinates[0]))
    for i in 0...yourCoordinates.count - 1 {
        context!.addLine(to: snapshot.point(for: yourCoordinates[i]))
        context!.move(to: snapshot.point(for: yourCoordinates[i]))
    }
    // apply the stroke to the context
    context!.strokePath()
    // get the image from the graphics context
    let resultImage = UIGraphicsGetImageFromCurrentImageContext()
    // end the graphics context
    UIGraphicsEndImageContext()
    return resultImage!
}
That's it, hope this helps someone.
References
How do I draw on an image in Swift?
MKTileOverlay, MKMapSnapshotter & MKDirections
Creating an MKMapSnapshotter with an MKPolylineRenderer
Render a Map as an Image using MapKit
What is wrong with:
snapshotter.start(completionHandler: { snapshot, error in
    guard snapshot != nil else {
        withCallback(nil, error)
        return
    }
    if let image = snapshot?.image {
        withCallback(image, nil)
        for coordinate in self.area {
            image.draw(at: snapshot!.point(for: coordinate))
        }
    }
})
If you just want a copy of the image the user sees in the MKMapView, remember that it's a UIView subclass, and so you could do this...
public extension UIView {
    public var snapshot: UIImage? {
        get {
            UIGraphicsBeginImageContextWithOptions(self.bounds.size, false, UIScreen.main.scale)
            self.drawHierarchy(in: self.bounds, afterScreenUpdates: true)
            let image = UIGraphicsGetImageFromCurrentImageContext()
            UIGraphicsEndImageContext()
            return image
        }
    }
}
// ...
if let img = self.mapView.snapshot {
    // Do something
}

Create MapKit Circle overlay from multiple CloudKit records

I've been trying to add a new map view to my app which shows an overlay of all of the Geofenced regions in my CloudKit database.
At the moment I'm able to create pins from each of the locations with the following code.
func fetchData() {
    let predicate = NSPredicate(format: "TRUEPREDICATE", argumentArray: nil)
    let query = CKQuery(recordType: "Collection", predicate: predicate)
    let operation = CKQueryOperation(query: query)
    operation.desiredKeys = ["Location"]
    operation.recordFetchedBlock = { (record: CKRecord) in
        self.collectionLocation = record.objectForKey("Location") as? CLLocation
        print(self.collectionLocation?.coordinate.latitude)
        self.buildBubbles()
    }
    publicDB!.addOperation(operation)
    operation.queryCompletionBlock = { (cursor, error) in
        dispatch_async(dispatch_get_main_queue()) {
            if error == nil {
            } else {
                print("error description = \(error?.description)")
            }
        }
    }
}
func buildBubbles() {
    if CLLocationManager.isMonitoringAvailableForClass(CLCircularRegion.self) {
        let intrepidLat: CLLocationDegrees = (self.collectionLocation?.coordinate.latitude)!
        let intrepidLong: CLLocationDegrees = (self.collectionLocation?.coordinate.longitude)!
        let title = "Item"
        let coordinate = CLLocationCoordinate2DMake(intrepidLat, intrepidLong)
        let regionRadius = 300.0
        let region = CLCircularRegion(center: CLLocationCoordinate2D(latitude: coordinate.latitude,
                                                                     longitude: coordinate.longitude), radius: regionRadius, identifier: title)
        self.locationManager.startMonitoringForRegion(region)
        let restaurantAnnotation = MKPointAnnotation()
        restaurantAnnotation.coordinate = coordinate
        restaurantAnnotation.title = "\(title)"
        self.mapView.addAnnotation(restaurantAnnotation)
        // Overlay code goes here
    }
    else {
        print("System can't track regions")
    }
}
But when I go to add the overlay:
let circle = MKCircle(centerCoordinate: coordinate, radius: regionRadius)
self.mapView.addOverlay(circle)
The app fails with error:
"This application is modifying the autolayout engine from a background
thread, which can lead to engine corruption and weird crashes. This
will cause an exception in a future release."
My guess is that I'm doing too much inside the background thread, but when I move the buildBubbles function onto the main queue, it adds the circle overlay but only adds one of the locations to the map.
Thanks for taking the time to look I would really appreciate any help.
Your interface into the bubbles function only provides for holding one location. Try changing the interface, for example to take an array, and then see what you get. You will also need to think about how you synchronize the one with the other.
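A minimal sketch of that interface change, in the same Swift 2 style as the question (the per-location body just mirrors what buildBubbles() already does; the title and radius are placeholders):

// Sketch: accept all fetched locations at once instead of one stored property.
func buildBubbles(locations: [CLLocation]) {
    for location in locations {
        let coordinate = location.coordinate
        let annotation = MKPointAnnotation()
        annotation.coordinate = coordinate
        annotation.title = "Item" // placeholder title
        self.mapView.addAnnotation(annotation)
        self.mapView.addOverlay(MKCircle(centerCoordinate: coordinate, radius: 300.0))
    }
}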
I did as Feldur suggested and created an array from the CloudKit data, then moved the MapKit setup off the background thread.
func fetchBubble() {
    let query = CKQuery(recordType: "Collection", predicate: NSPredicate(format: "TRUEPREDICATE", argumentArray: nil))
    publicDB!.performQuery(query, inZoneWithID: nil) { results, error in
        if error == nil {
            for collection in results! {
                let collectionLocation = collection.valueForKey("Location") as? CLLocation
                let collectionName = collection.valueForKey("Name") as! String
                dispatch_async(dispatch_get_main_queue(), { () -> Void in
                    if CLLocationManager.isMonitoringAvailableForClass(CLCircularRegion.self) {
                        let intrepidLat: CLLocationDegrees = (collectionLocation?.coordinate.latitude)!
                        let intrepidLong: CLLocationDegrees = (collectionLocation?.coordinate.longitude)!
                        let title = collectionName
                        let coordinate = CLLocationCoordinate2DMake(intrepidLat, intrepidLong)
                        let regionRadius = 50.0
                        let region = CLCircularRegion(center: CLLocationCoordinate2D(latitude: coordinate.latitude,
                                                                                     longitude: coordinate.longitude), radius: regionRadius, identifier: title)
                        self.locationManager.startMonitoringForRegion(region)
                        let restaurantAnnotation = MKPointAnnotation()
                        self.mapView.addAnnotation(restaurantAnnotation)
                        restaurantAnnotation.coordinate = coordinate
                        let circle = MKCircle(centerCoordinate: coordinate, radius: regionRadius)
                        self.mapView.addOverlay(circle)
                        self.numberOfObjectsInMyArray()
                    }
                    else {
                        print("System can't track regions")
                    }
                })
            }
        }
        else {
            print(error)
        }
    }
}
