SwiftUI: How to generate a Map Route with a transparent background?

I'm trying to create a map that shows the route of a workout, but I can't find anything in the MapKit documentation about customizing the background. I want the map itself to be transparent so that only the route (the annotations) is visible. How can I do this?
struct MapOverlay: View {
    @ObservedObject var workoutDetailViewModel: WorkoutDetailViewModel

    var body: some View {
        if let unwrappedWorkoutLocations = workoutDetailViewModel.fullyLoadedWorkout?.workoutLocations {
            Map(
                coordinateRegion: .constant(
                    MKCoordinateRegion(
                        center: unwrappedWorkoutLocations.map { $0.coordinate }.center(), // Use the midpoint of the workout as the centre of the map.
                        span: MKCoordinateSpan(latitudeDelta: 0.02, longitudeDelta: 0.02)
                    )
                ),
                annotationItems: unwrappedWorkoutLocations.map { $0.coordinate }
            ) { routeLocation in
                MapAnnotation(coordinate: routeLocation) {
                    Circle().fill(TrackerConstants.AppleFitnessOrange)
                }
            }
            .cornerRadius(10)
        }
    }
}

struct MapOverlay_Previews: PreviewProvider {
    static var previews: some View {
        MapOverlay(workoutDetailViewModel: WorkoutDetailViewModel())
    }
}

Don't use MKMapView. You want to take the coordinates, make a UIBezierPath from them, and render that into your own view or a UIImage. Something like this playground:
import CoreLocation
import MapKit
let myCoords: [CLLocationCoordinate2D] = [
    .init(latitude: 42.42, longitude: 42.42),
    .init(latitude: 42.43, longitude: 42.425),
    .init(latitude: 42.425, longitude: 42.427),
    .init(latitude: 42.422, longitude: 42.426),
]
let r = MKPolylineRenderer(polyline: .init(coordinates: myCoords, count: myCoords.count))
let path = r.path!
let bezier = UIBezierPath(cgPath: path)
bezier.apply(.init(scaleX: 0.05, y: 0.05))
let renderer = UIGraphicsImageRenderer(bounds: .init(x: 0, y: 0, width: 640, height: 480))
let image = renderer.image { context in
    let size = renderer.format.bounds.size
    UIColor.darkGray.setFill()
    context.fill(CGRect(x: 0, y: 0, width: size.width, height: size.height))
    UIColor.black.setStroke()
    bezier.lineWidth = 5
    bezier.stroke()
}
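To get the transparent background the question asks for, skip the background fill entirely; UIGraphicsImageRenderer produces an image with an alpha channel by default. A minimal sketch along the same lines (the scale factor, stroke colour, and helper name are placeholders of my own):

```swift
import MapKit
import UIKit

/// Renders a set of coordinates as a route image with a clear background.
func routeImage(from coords: [CLLocationCoordinate2D],
                size: CGSize = CGSize(width: 640, height: 480)) -> UIImage {
    let route = MKPolylineRenderer(polyline: MKPolyline(coordinates: coords, count: coords.count))
    let bezier = UIBezierPath(cgPath: route.path)
    bezier.apply(CGAffineTransform(scaleX: 0.05, y: 0.05))
    return UIGraphicsImageRenderer(bounds: CGRect(origin: .zero, size: size)).image { _ in
        // No background fill here: everything outside the stroked
        // route stays fully transparent.
        UIColor.orange.setStroke()
        bezier.lineWidth = 5
        bezier.stroke()
    }
}
```

In SwiftUI you could then show it with `Image(uiImage: routeImage(from: myCoords))`, layered over whatever background you like.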

Related

How do I create an octagon shape in Google Maps?

I would like to draw an octagonal shape but don't know how to achieve this. Here is my code:
func drawOctagonalShape() {
    let camera = GMSCameraPosition.camera(withLatitude: 37.4, longitude: -122.0, zoom: 10)
    let mapView = GMSMapView.map(withFrame: CGRect.zero, camera: camera)
    mapView.isMyLocationEnabled = true
    mapView.settings.myLocationButton = true
    self.view = mapView
    // Creates a marker in the center of the map.
    let marker = GMSMarker()
    marker.position = CLLocationCoordinate2D(latitude: 37.36, longitude: -122.0)
    marker.title = "India"
    //marker.snippet = "Malaysia"
    marker.map = mapView
    let rect = GMSMutablePath()
    rect.add(CLLocationCoordinate2D(latitude: 37.36, longitude: -122.0))
    rect.add(CLLocationCoordinate2D(latitude: 37.45, longitude: -122.0))
    rect.add(CLLocationCoordinate2D(latitude: 37.45, longitude: -122.2))
    rect.add(CLLocationCoordinate2D(latitude: 37.36, longitude: -122.2))
    // Create the polygon, and assign it to the map.
    let polygon = GMSPolygon(path: rect)
    polygon.fillColor = UIColor(red: 0.25, green: 0, blue: 0, alpha: 0.05)
    polygon.strokeColor = UIColor(hue: 210, saturation: 88, brightness: 84, alpha: 1)
    //polygon.strokeColor = .black
    polygon.strokeWidth = 2
    polygon.map = mapView
}
An octagon is basically 8 points distributed equally on a circle. You can create a method like the following:
func generateCircle(around center: CLLocationCoordinate2D, radius: Double, pointCount: Int = 8, rotation: Double = 0) -> [CLLocationCoordinate2D] {
    (0..<pointCount).map { index in
        let angle = rotation + (Double(index) / Double(pointCount)) * Double.pi * 2.0
        return .init(latitude: center.latitude + cos(angle) * radius, longitude: center.longitude + sin(angle) * radius)
    }
}
I hope the code speaks for itself. You would use it in your code like this:
let rect = GMSMutablePath()
generateCircle(around: myCenter, radius: someSmallRadius, pointCount: 8, rotation: 0).forEach { coordinate in
    rect.add(coordinate)
}
I would expect this to work in most cases. There are two potential problems, however:
- I am unsure how this would behave around the edges of the longitude and latitude ranges.
- A radius expressed in degrees may be a bit inaccurate at some points.
Would it not be easier to simply generate an image and place that on your map?
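On the second caveat: a degree of longitude shrinks with latitude, so a fixed degree radius draws an ellipse away from the equator. A rough correction, assuming a spherical Earth (the helper name is my own invention):

```swift
import Foundation

/// Converts a radius in metres to approximate degree offsets at a latitude.
/// Assumes a spherical Earth of radius 6,371 km; breaks down near the poles.
func degreeRadii(metres: Double, atLatitude latitude: Double) -> (lat: Double, lon: Double) {
    let earthRadius = 6_371_000.0
    // Arc length to angle, then radians to degrees.
    let latDelta = (metres / earthRadius) * 180.0 / Double.pi
    // A degree of longitude spans cos(latitude) times as many metres.
    let lonDelta = latDelta / cos(latitude * Double.pi / 180.0)
    return (latDelta, lonDelta)
}
```

You could then feed the two values in as separate radii when generating the points, scaling `cos(angle)` by `lat` and `sin(angle)` by `lon`.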

How to update UIViewRepresentable map through Binding and ObservedObject

I have this simple MapView and an ObservedObject data source that should update it, but it is not working.
This is the content view:
struct ContentView: View {
    @ObservedObject var trackingOnMapViewModel = TrackingOnMapViewModel()

    var body: some View {
        ZStack {
            MapView(selectedRegion: $trackingOnMapViewModel.selectedRegion)
                .edgesIgnoringSafeArea(.vertical)
            VStack {
                Spacer()
                Button(action: {
                    self.trackingOnMapViewModel.selectNextRegion()
                }) {
                    Text("Next")
                        .padding()
                }
            }
        }
    }
}
This is a simple map view:
struct MapView: UIViewRepresentable {
    @Binding var selectedRegion: MKCoordinateRegion

    func makeUIView(context: Context) -> MKMapView {
        print("MapView - makeUIView")
        let map = MKMapView()
        return map
    }

    func updateUIView(_ mapView: MKMapView, context: Context) {
        print("MapView - updateUIView")
        mapView.setRegion(selectedRegion, animated: true)
    }
}
And this is the datasource:
class TrackingOnMapViewModel: ObservableObject {
    var regions: [MKCoordinateRegion] = [
        MKCoordinateRegion(center: CLLocationCoordinate2D(latitude: 10.0, longitude: 10.0), span: MKCoordinateSpan(latitudeDelta: 2.0, longitudeDelta: 2.0)),
        MKCoordinateRegion(center: CLLocationCoordinate2D(latitude: 20.0, longitude: 20.0), span: MKCoordinateSpan(latitudeDelta: 2.0, longitudeDelta: 2.0)),
        MKCoordinateRegion(center: CLLocationCoordinate2D(latitude: 30.0, longitude: 30.0), span: MKCoordinateSpan(latitudeDelta: 2.0, longitudeDelta: 2.0)),
        MKCoordinateRegion(center: CLLocationCoordinate2D(latitude: 40.0, longitude: 40.0), span: MKCoordinateSpan(latitudeDelta: 2.0, longitudeDelta: 2.0))
    ]
    var selectedRegion: MKCoordinateRegion = MKCoordinateRegion(center: CLLocationCoordinate2D(latitude: 0.0, longitude: 0.0), span: MKCoordinateSpan(latitudeDelta: 2.0, longitudeDelta: 2.0))
    var currentIndex = 0

    func selectNextRegion() {
        print("TrackingOnMapViewModel - selectNextLandmark")
        currentIndex = currentIndex < regions.count - 1 ? currentIndex + 1 : 0
        self.selectedRegion = regions[currentIndex]
        print("selectedRegion - \(selectedRegion)")
    }
}
In this case the map is not updated.
If I put the logic into the ContentView without the ObservedObject, like this:
struct ContentView: View {
    // @ObservedObject var trackingOnMapViewModel = TrackingOnMapViewModel()
    @State var regions: [MKCoordinateRegion] = [
        MKCoordinateRegion(center: CLLocationCoordinate2D(latitude: 10.0, longitude: 10.0), span: MKCoordinateSpan(latitudeDelta: 2.0, longitudeDelta: 2.0)),
        MKCoordinateRegion(center: CLLocationCoordinate2D(latitude: 20.0, longitude: 20.0), span: MKCoordinateSpan(latitudeDelta: 2.0, longitudeDelta: 2.0)),
        MKCoordinateRegion(center: CLLocationCoordinate2D(latitude: 30.0, longitude: 30.0), span: MKCoordinateSpan(latitudeDelta: 2.0, longitudeDelta: 2.0)),
        MKCoordinateRegion(center: CLLocationCoordinate2D(latitude: 40.0, longitude: 40.0), span: MKCoordinateSpan(latitudeDelta: 2.0, longitudeDelta: 2.0))
    ]
    @State var selectedRegion: MKCoordinateRegion = MKCoordinateRegion(center: CLLocationCoordinate2D(latitude: 0.0, longitude: 0.0), span: MKCoordinateSpan(latitudeDelta: 2.0, longitudeDelta: 2.0))
    @State var currentIndex = 0

    var body: some View {
        ZStack {
            MapView(selectedRegion: $selectedRegion)
                .edgesIgnoringSafeArea(.vertical)
            VStack {
                Spacer()
                Button(action: {
                    self.selectNextRegion()
                }) {
                    Text("Next")
                        .padding()
                }
            }
        }
    }

    private func selectNextRegion() {
        print("ContentView - selectNextLandmark")
        currentIndex = currentIndex < regions.count - 1 ? currentIndex + 1 : 0
        self.selectedRegion = regions[currentIndex]
    }
}
then the map is updating.
Could you help me with this?
Just after posting the question, I found the problem myself.
I had forgotten to mark the selectedRegion variable as @Published in TrackingOnMapViewModel:
@Published var selectedRegion: MKCoordinateRegion = MKCoordinateRegion(center: CLLocationCoordinate2D(latitude: 0.0, longitude: 0.0), span: MKCoordinateSpan(latitudeDelta: 2.0, longitudeDelta: 2.0))

Google maps draws circle at wrong position

I'm showing a marker and drawing a circle using the same coordinates. I don't know why, but my marker is offset a little towards the top. What do you think the problem is?
override func viewDidLoad() {
    super.viewDidLoad()
    let camera = GMSCameraPosition.camera(withLatitude: 37.36, longitude: -122.0, zoom: 13.0)
    setupMapWith(cameraPosition: camera)
    showMarker(position: camera.target)
    circleWith(position: camera.target)
}

func showMarker(position: CLLocationCoordinate2D) {
    let marker = GMSMarker()
    marker.position = position
    marker.title = "Palo Alto"
    marker.snippet = "San Francisco"
    let markerView = UIView(frame: CGRect(x: 0, y: 0, width: 20, height: 20))
    markerView.backgroundColor = .red
    marker.iconView = markerView
    marker.map = mapView
}

func circleWith(position: CLLocationCoordinate2D) {
    let circ = GMSCircle(position: position, radius: 1000)
    circ.fillColor = UIColor.MapAreaFilter.areaFilterColor
    circ.map = mapView
}
Try setting your marker's groundAnchor property:
marker.groundAnchor = CGPoint(x: 0.5, y: 0.5)
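For context: groundAnchor defaults to (0.5, 1.0), i.e. the bottom-centre of the iconView is pinned to the coordinate, which is why a square icon appears shifted upward relative to the circle's centre. A sketch of the marker setup with the anchor centred (assumes the Google Maps SDK; the helper name is my own):

```swift
import GoogleMaps

func centeredMarker(at position: CLLocationCoordinate2D, on mapView: GMSMapView) -> GMSMarker {
    let marker = GMSMarker(position: position)
    let markerView = UIView(frame: CGRect(x: 0, y: 0, width: 20, height: 20))
    markerView.backgroundColor = .red
    marker.iconView = markerView
    // Pin the centre of the view (not its bottom edge) to the coordinate,
    // so the icon sits exactly on the circle's centre.
    marker.groundAnchor = CGPoint(x: 0.5, y: 0.5)
    marker.map = mapView
    return marker
}
```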

Convert VNRectangleObservation points to other coordinate system

I need to convert the CGPoints received from a VNRectangleObservation (bottomLeft, bottomRight, topLeft, topRight) to another coordinate system (e.g. a view's coordinates on screen).
I define a request:
// Rectangle Request
let rectangleDetectionRequest = VNDetectRectanglesRequest(completionHandler: handleRectangles)
rectangleDetectionRequest.minimumSize = 0.5
rectangleDetectionRequest.maximumObservations = 1
I get the sampleBuffer from the camera in a delegate call and perform a detection request:
func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    var requestOptions: [VNImageOption: Any] = [:]
    if let cameraIntrinsicData = CMGetAttachment(sampleBuffer, kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix, nil) {
        requestOptions = [.cameraIntrinsics: cameraIntrinsicData]
    }
    let imageRequestHandler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, orientation: CGImagePropertyOrientation(rawValue: 6)!, options: requestOptions)
    do {
        try imageRequestHandler.perform(self.requests)
    } catch {
        print(error)
    }
}
Later, in the completion handler, I receive the results:
func handleRectangles(request: VNRequest, error: Error?) {
    guard let results = request.results as? [VNRectangleObservation] else { return }
    let flipTransform = CGAffineTransform(scaleX: 1, y: -1).translatedBy(x: 0, y: -self.previewView.frame.height)
    let scaleTransform = CGAffineTransform.identity.scaledBy(x: self.previewView.frame.width, y: self.previewView.frame.height)
    for rectangle in results {
        let rectangleBounds = rectangle.boundingBox.applying(scaleTransform).applying(flipTransform)
        // convertedTopLeft = conversion(rectangle.topLeft)
        // convertedTopRight = conversion(rectangle.topRight)
        // convertedBottomLeft = conversion(rectangle.bottomLeft)
        // convertedBottomRight = conversion(rectangle.bottomRight)
    }
}
This works for boundingBox, which is a CGRect, but I need to transform the CGPoints themselves to the coordinate system of another view.
The problem is that I don't know how to get the transformation from the CMSampleBuffer's coordinate system to the previewView's coordinate system.
Thanks!
It was simply a matter of applying the transform to each CGPoint itself, where size is the CGSize of the destination view into which I need to transpose the four points:
let transform = CGAffineTransform.identity
    .scaledBy(x: 1, y: -1)
    .translatedBy(x: 0, y: -size.height)
    .scaledBy(x: size.width, y: size.height)

let convertedTopLeft = rectangle.topLeft.applying(transform)
let convertedTopRight = rectangle.topRight.applying(transform)
let convertedBottomLeft = rectangle.bottomLeft.applying(transform)
let convertedBottomRight = rectangle.bottomRight.applying(transform)
@mihaicris's answer works, but only in portrait mode. In landscape we need to do it a little differently:
let transform: CGAffineTransform
if UIApplication.shared.statusBarOrientation.isLandscape {
    transform = CGAffineTransform.identity
        .scaledBy(x: -1, y: 1)
        .translatedBy(x: -size.width, y: 0)
        .scaledBy(x: size.width, y: size.height)
} else {
    transform = CGAffineTransform.identity
        .scaledBy(x: 1, y: -1)
        .translatedBy(x: 0, y: -size.height)
        .scaledBy(x: size.width, y: size.height)
}

let convertedTopLeft = rectangle.topLeft.applying(transform)
let convertedTopRight = rectangle.topRight.applying(transform)
let convertedBottomLeft = rectangle.bottomLeft.applying(transform)
let convertedBottomRight = rectangle.bottomRight.applying(transform)
I assume you use a layer for the camera, and that the layer is an AVCaptureVideoPreviewLayer (https://developer.apple.com/documentation/avfoundation/avcapturevideopreviewlayer).
If you want to convert a single point, use the function layerPointConverted (https://developer.apple.com/documentation/avfoundation/avcapturevideopreviewlayer/1623502-layerpointconverted). Please notice that the y is inverted because of the VNRectangleObservation coordinate system.
let convertedTopLeft: CGPoint = cameraLayer.layerPointConverted(fromCaptureDevicePoint: CGPoint(x: rectangle.topLeft.x, y: 1 - rectangle.topLeft.y))
let convertedTopRight: CGPoint = cameraLayer.layerPointConverted(fromCaptureDevicePoint: CGPoint(x: rectangle.topRight.x, y: 1 - rectangle.topRight.y))
let convertedBottomLeft: CGPoint = cameraLayer.layerPointConverted(fromCaptureDevicePoint: CGPoint(x: rectangle.bottomLeft.x, y: 1 - rectangle.bottomLeft.y))
let convertedBottomRight: CGPoint = cameraLayer.layerPointConverted(fromCaptureDevicePoint: CGPoint(x: rectangle.bottomRight.x, y: 1 - rectangle.bottomRight.y))
Hope it helped

How to draw gradient with SKKeyframeSequence: as per Apple docs

The Apple docs on SKKeyframeSequence have brief sample code designed to create a gradient:
let colorSequence = SKKeyframeSequence(keyframeValues: [SKColor.green,
                                                        SKColor.yellow,
                                                        SKColor.red,
                                                        SKColor.blue],
                                       times: [0, 0.25, 0.5, 1])
colorSequence.interpolationMode = .linear
stride(from: 0, to: 1, by: 0.001).forEach {
    let color = colorSequence.sample(atTime: CGFloat($0)) as! SKColor
}
When combined with a drawing system of some sort, this is said to output a horizontal gradient blending those four colours.
How can this be drawn from the sampling of the sequence of colours in the demo code?
P.S. I don't have any clue how to draw this with SpriteKit objects, hence the absence of attempted code. I'm not asking for code, just an answer on how to use this 'array' of colours to create a gradient that can be used as a texture in SpriteKit.
The colors are different for some reason, but here is what I came up with using their source code:
Playground setup:
import SpriteKit
import PlaygroundSupport

let sceneView = SKView(frame: CGRect(origin: CGPoint.zero, size: CGSize(width: 1000, height: 450)))
let scene = SKScene(size: CGSize(width: 1000, height: 450))

LOADSCENE: do {
    scene.backgroundColor = .white
    scene.anchorPoint = CGPoint(x: 0, y: 0.5)
    scene.physicsWorld.gravity = CGVector.zero
    sceneView.presentScene(scene)
    PlaygroundPage.current.liveView = sceneView
}
Solution:
// Utility func:
func drawLine(from point1: CGPoint, to point2: CGPoint, color: SKColor) {
    let linePath = CGMutablePath()
    linePath.move(to: point1)
    linePath.addLine(to: point2)
    let newLine = SKShapeNode(path: linePath)
    newLine.strokeColor = color
    newLine.lineWidth = 1
    newLine.zPosition = 10
    scene.addChild(newLine)
    newLine.position.x = point1.x
}

// Holds our soon-to-be-generated colors:
var colors = [SKColor]()

LOADCOLORS: do {
    let colorSequence = SKKeyframeSequence(keyframeValues: [SKColor.green,
                                                            SKColor.yellow,
                                                            SKColor.red,
                                                            SKColor.blue],
                                           times: [0, 0.25, 0.5, 1])
    colorSequence.interpolationMode = .linear
    stride(from: 0, to: 1, by: 0.001).forEach {
        colors.append(colorSequence.sample(atTime: CGFloat($0)) as! SKColor)
    }
}

DRAWGRAD: do {
    for i in 1...999 {
        let p1 = CGPoint(x: CGFloat(i), y: scene.frame.minY)
        let p2 = CGPoint(x: CGFloat(i), y: scene.frame.maxY)
        drawLine(from: p1, to: p2, color: colors[i])
    }
    print("Give me my 25 cookie points, please and TY")
}
You should then be able to grab this as a texture like so:
let texture = sceneView.texture(from: scene)
Rendering this took about a million years on my gen-2 i5 at 2.6 GHz for some reason. Will have to look into that, unless it was just a playground bug...
