MapKit (iOS) -- Live MKAnnotation Motion

I'm designing an app using Swift for a food truck. I want to use MapKit to show a pin on the live location of the truck so one can look at the app and see it moving in real time. I was able to get the server-side element set up using PHP/MySQL. Currently, the app makes an HTTP request every 3 seconds.
How should I go about animating the moving pin (or other image)? I tried a few methods already (if you have others, please suggest!):
I removed the pin and quickly added a new one in my HTTP function. While this works, I would prefer a smooth animation, not a flashing, jerky pin.
I implemented MKMapViewDelegate with a didAddAnnotationViews method that animates the pin's frame using UIView.animateWithDuration, as suggested here. The animation only occurred when the map was initially loaded. Also, I have no idea how this would work with coordinates, since it involves frames.
I subclassed MKAnnotation with a modifiable var coordinate. In my HTTP function, I changed the coordinate. EDIT: This now works since I properly refreshed the mapView (thanks @Paulw11). However, I still have two main issues:
There's no linear animation, which I'd like in order to better simulate real-time movement. How would I go about animating the pin's coordinates? Should I use UIView, like in method 2, a fast NSTimer, or something else? (One approach is sketched after these two points.)
Because I use the setCenterCoordinate function to forcibly refresh the map, the map cancels any current touches. How do I detect whether there are any touches on the MapView so I can prevent the forced update?
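For the first issue, a minimal sketch of one common approach, in current Swift syntax (the TruckAnnotation class name and the 3-second duration are illustrative, not from the question): make the annotation's coordinate KVO-observable and change it inside a UIView animation block, so MKMapView slides the pin to the new position instead of jumping.

```swift
import MapKit
import UIKit

// Hypothetical annotation class; @objc dynamic makes coordinate
// KVO-observable, which lets MKMapView animate changes to it.
class TruckAnnotation: NSObject, MKAnnotation {
    @objc dynamic var coordinate: CLLocationCoordinate2D

    init(coordinate: CLLocationCoordinate2D) {
        self.coordinate = coordinate
        super.init()
    }
}

// Called from the HTTP completion handler with each new server position.
func updateTruckPosition(_ annotation: TruckAnnotation,
                         to newCoordinate: CLLocationCoordinate2D) {
    UIView.animate(withDuration: 3.0,
                   delay: 0,
                   options: [.curveLinear, .allowUserInteraction],
                   animations: {
        // The map view observes the coordinate change and glides the pin.
        annotation.coordinate = newCoordinate
    })
}
```

Matching the animation duration to the polling interval gives a roughly continuous linear motion between updates.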

Related

How do I set fire to a UIView in Swift?

How do I produce an animation that simulates the burning effect of fire consuming an UIView from top to bottom in Swift?
I found Fireworks, an app that allows users to tweak and try out different settings of CAEmitterLayer with instant results. The effects are great for applying to a whole screen but how would I use it for my purpose - where the UIView must disappear as the fire consumes it from one end to the other?
Is there some tutorial on consuming UIViews with fire using the particle emitter anywhere? I know that I’m supposed to show some code but anything I put here would be irrelevant. I’ve also exhausted my search engine looking for something similar. That’s how I found the Fireworks app actually.
This seems to be a use case that shouldn't be uncommon.
I haven't done much with CAEmitterLayer, so I decided to try my hand at this.
I created a project that does this and posted it on GitHub. It uses the approach in this YouTube video as a starting point. You can download it here:
FireEmitter project
Here is a small thumbnail of what it looks like:
The project includes a custom subclass of UIView called BurnItDownView.
The BurnItDownView is meant to contain other views.
It has one public method, burnItDown(). That triggers the animation.
There are multiple parts to the animation:
A CAEmitterLayer set up to simulate flames burning off a flat surface,
An animation that lowers the emitter layer from the top of the view to the bottom,
A CAGradientLayer applied as a mask to the view. It starts out fully opaque, with colors of [.clear, .white, .white] and locations of [-0.5, 0, 1] (so the clear color sits above the top of the view), and the gradient's locations property is animated to mask away the view's contents from top to bottom. (Animating the locations to [0, 0, 0] fills the gradient layer with the clear color, fully masking the view's layer; a simplified sketch of this step appears below.)
Once the view is fully masked, it starts lowering the "birthRate" of the emitter layer in steps until the birth rate is 0. It then holds this step for 2 seconds until all the flame particles have animated away.
Once the flame is fully "extinguished", it resets the locations array to the original value of [-0.5, 0, 1]. This causes an "implicit animation", so the view animates back in from the bottom, but quickly.
Finally, it replaces the emitter layer and emitter cells with newly created ones so it is ready for the next pass of the animation. (I couldn't figure out how to restore the emitter to its original state; it was simpler to just create new ones.) It also invokes an optional completion handler passed to the burnItDown() method. (The app's view controller uses that closure to re-enable the "Burn it down" button.)
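To make the gradient-mask step more concrete, here is a stripped-down sketch of that part alone. The function name and the exact stop values are illustrative rather than taken from the FireEmitter project; the idea is simply to animate the mask's locations so a clear region sweeps down over the view.

```swift
import UIKit

// Illustrative sketch of the gradient-mask step only. The mask starts fully
// opaque, then its `locations` are animated so a clear region sweeps down
// from the top, hiding the view's contents as the flames pass.
func burnAwayContents(of view: UIView, duration: CFTimeInterval) {
    let mask = CAGradientLayer()
    mask.frame = view.bounds
    // Clear above the view, opaque (white) over the whole view.
    mask.colors = [UIColor.clear.cgColor, UIColor.white.cgColor, UIColor.white.cgColor]
    mask.locations = [-0.5, 0, 1]
    view.layer.mask = mask

    // Push every stop below the bottom edge so the whole view ends up
    // covered by the clear color and is fully masked away.
    let sweep = CABasicAnimation(keyPath: "locations")
    sweep.fromValue = [-0.5, 0, 1]
    sweep.toValue = [1, 1.5, 2.5]
    sweep.duration = duration
    sweep.timingFunction = CAMediaTimingFunction(name: .linear)
    sweep.fillMode = .forwards
    sweep.isRemovedOnCompletion = false
    mask.add(sweep, forKey: "burnAway")
}
```

In the real project this animation is synchronized with lowering the emitter layer, so the flames appear to eat the content as it disappears.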
I was once in your shoes and came across this open-source library called particle animations.
I would NOT recommend using the library itself, since it's deprecated. But I would recommend referring to its source code to get an idea of how to use CAEmitterLayer and CAEmitterCell to create the look of a fire!
As you can see from its README, it has direct examples of fire. It also states that even Apple and Facebook use CAEmitterLayer and CAEmitterCell to produce fire effects.
Feel free to ask if you have more questions.
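For reference, a basic CAEmitterLayer/CAEmitterCell flame setup looks roughly like the sketch below. The particle image, colors, and numbers are guesses to tune by eye, not values taken from the particle animations library.

```swift
import UIKit

// Rough flame emitter: particles rise from a line along the bottom edge,
// shrinking and fading as they go.
func makeFireEmitter(in bounds: CGRect, particleImage: UIImage) -> CAEmitterLayer {
    let emitter = CAEmitterLayer()
    emitter.frame = bounds
    emitter.emitterShape = .line
    emitter.emitterPosition = CGPoint(x: bounds.midX, y: bounds.maxY)
    emitter.emitterSize = CGSize(width: bounds.width, height: 1)

    let flame = CAEmitterCell()
    flame.contents = particleImage.cgImage
    flame.birthRate = 200
    flame.lifetime = 1.5
    flame.velocity = 80
    flame.velocityRange = 30
    flame.emissionLongitude = -CGFloat.pi / 2   // emit upward
    flame.emissionRange = .pi / 8
    flame.scale = 0.3
    flame.scaleSpeed = -0.15                    // shrink as they rise
    flame.alphaSpeed = -0.7                     // fade out over the lifetime
    flame.color = UIColor.orange.cgColor

    emitter.emitterCells = [flame]
    return emitter
}
```

You would add the returned layer as a sublayer (view.layer.addSublayer(...)) and feed it a small, soft, blurry particle image; most of the "fire" look comes from tuning those numbers and the image.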

Is it possible to render a tableview 'skewed'?

I'm coding in Swift 2.0 for devices running iOS7+.
Is it possible to present a tableview in a skewed/diagonal/slanted format as indicated below?
Obviously if the answer is yes, what process would I need to go through to get the result?
Yes, it's possible. Views in iOS have a transform property, of type CGAffineTransform, and you can use it to make the view appear skewed. I don't know offhand how to create a transform that produces the skewing effect; I suggest doing some Google searching.
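As a hedged sketch of what such a transform could look like (the helper name and the angle are illustrative): a shear is an affine transform whose c component shifts x in proportion to y.

```swift
import UIKit

// Illustrative helper: a shear ("skew") transform. The `c` component shifts
// x in proportion to y, which slants the view by the given angle.
func applySkew(to tableView: UITableView, degrees: Double) {
    let radians = degrees * .pi / 180
    tableView.transform = CGAffineTransform(a: 1, b: 0,
                                            c: CGFloat(tan(radians)),
                                            d: 1, tx: 0, ty: 0)
}

// e.g. applySkew(to: tableView, degrees: 15)
```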
The next issue you will face is interacting with taps. Changing the transform of a view does not transform the coordinate system applied to taps, so taps will still land on the non-skewed views. That will be much harder to sort out, and without doing a fair amount of research I don't have an answer for you on that one. (It would probably be possible to intercept touch events before they get to your table view and apply the inverse of your skewing transform to them so that you map the taps back to the rectangular coordinate system the table view is expecting.)

Make a Mapbox RMAnnotation display its callout programmatically

I'm making an iOS app (using Swift) that shows a map using the Mapbox iOS SDK. I've gotten to the point of displaying several markers on the map. Now I want the user to be able to select a marker from a list, which pans to that marker (easy) and also makes the marker's callout bubble appear automatically without the user having to tap it (not so easy).
It's this last task I'm having trouble with. While I've found the RMMarker class's showLabel() method, I can't seem to directly access an RMAnnotation's associated RMMarker object, so I'm not sure where or how to call this method.
Does anyone know how this is done?
Ignore the showLabel() API; that is not the callout, but rather a text label that can be placed directly on the annotation.
You probably want -[RMMapView selectAnnotation:animated:] with NO for the animated argument.
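In Swift, that call would look roughly like this (mapView and annotation standing in for your RMMapView and the RMAnnotation chosen from the list):

```swift
// Pan however you already do, then select the annotation without animation
// so its callout appears immediately.
mapView.selectAnnotation(annotation, animated: false)
```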

Detect if user moved map or if it happened programmatically (iOS MapKit)

OK, so I have a map loaded with pins from a remote JSON feed that is loaded into the app. This all works fine.
From initial experimenting, regionDidChangeAnimated gets called multiple times, so I moved my POST request to a method driven by a map-drag gesture recognizer, which fetches data from the JSON feed and reloads my map pins after removing the old ones. This also works perfectly.
The only issue I have left is that if I select an annotation close to the edge of the screen, the map moves slightly to accommodate the display of the annotation callout. regionDidChangeAnimated gets called when this happens, but my POST request doesn't run, since I trigger it from the gesture recognizer and this move came from the OS rather than from the user. I don't want to move my POST request into regionDidChangeAnimated, as it gets called multiple times, but I would like to know whether there is a way to test if the user moved the map or if it moved itself to accommodate the callout. From the research I've done, regionDidChangeAnimated may get called more times than necessary, so some guidance on how to prevent that, or on detecting user interaction versus the OS moving the map, would be good.
I have a similar problem: I want to distinguish whether the displayed part of a map has been changed programmatically or by user interaction. Apparently, MKMapView objects do not tell you; regionDidChangeAnimated is called in both cases without an indication of why.
But since MKMapView is a subclass of UIView, and that in turn of UIResponder, one can implement the methods touchesBegan:withEvent: and touchesEnded:withEvent: to find out if the map has been touched during the change of the displayed region.
If so, one can assume that the change in the displayed region was caused by user interaction. Of course you can be more specific if you investigate the set of touches and the event type of these methods more precisely.
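A minimal sketch of that idea (class, property, and outlet names are mine, not from the answer): a subclass records that a touch landed on the map, and the delegate checks and clears that flag in regionDidChangeAnimated. Only touchesBegan is needed for this flag style; touchesEnded/touchesCancelled could refine it as described above.

```swift
import MapKit
import UIKit

// Records that a finger landed on the map, so region changes can be
// attributed to the user rather than to programmatic moves (such as the
// shift that accommodates a callout).
class TouchAwareMapView: MKMapView {
    var wasTouched = false

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        wasTouched = true
        super.touchesBegan(touches, with: event)
    }
}

class MapViewController: UIViewController, MKMapViewDelegate {
    @IBOutlet var mapView: TouchAwareMapView!

    func mapView(_ mapView: MKMapView, regionDidChangeAnimated animated: Bool) {
        guard let map = mapView as? TouchAwareMapView, map.wasTouched else {
            return // region changed programmatically: ignore
        }
        map.wasTouched = false
        // The user moved the map: run the POST request and reload the pins here.
    }
}
```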
I am such a plank sometimes; writing out the problem can help. I simply removed regionDidChangeAnimated, as I have no need for it, and moved the code that was there (removing the annotations before re-adding them) into my gesture handler. Doh!

Animate coordinate changes to annotations on MKMapView

Is it possible to animate changes to the coordinates of an annotation on an MKMapView in iOS? Nothing in the API seems to indicate that this is possible, however apps like Uber seem to do this when showing car locations in their map. Perhaps they are calling setCoordinate: multiple times over a short time to give the appearance of animation taking place?
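As a rough sketch of that last idea (the helper name, step count, and interval are arbitrary, and a real implementation would need to handle overlapping updates), one could interpolate between the old and new coordinates on a timer:

```swift
import MapKit

// Moves an annotation to a destination by setting its coordinate many times
// over a short period, giving the appearance of a smooth slide.
@discardableResult
func slide(_ annotation: MKPointAnnotation,
           to destination: CLLocationCoordinate2D,
           over duration: TimeInterval,
           steps: Int = 60) -> Timer {
    let start = annotation.coordinate
    var step = 0
    return Timer.scheduledTimer(withTimeInterval: duration / Double(steps), repeats: true) { timer in
        step += 1
        let t = Double(step) / Double(steps)
        // MKPointAnnotation's coordinate is KVO-compliant, so the map view
        // redraws the pin at each intermediate position.
        annotation.coordinate = CLLocationCoordinate2D(
            latitude: start.latitude + (destination.latitude - start.latitude) * t,
            longitude: start.longitude + (destination.longitude - start.longitude) * t)
        if step >= steps { timer.invalidate() }
    }
}
```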
