I have a view controller in which the user can move a UIButton, a UISlider, and a custom UIView-based control around the screen by panning. This view controller lets the user create a custom layout of controls. It all works by adding a UIPanGestureRecognizer to each UIControl and moving the control's position to follow the user's finger.
let panRecognizer = UIPanGestureRecognizer(target: self, action: #selector(pan))
panRecognizer.delegate = self
myslider.addGestureRecognizer(panRecognizer)
// Pan handler method
@objc func pan(_ gestureRecognizer: UIPanGestureRecognizer) {
    let translation = gestureRecognizer.translation(in: view)
    guard let gestureView = gestureRecognizer.view else {
        return
    }
    // Move the center to the user's finger location
    gestureView.center = CGPoint(x: gestureView.center.x + translation.x,
                                 y: gestureView.center.y + translation.y)
    gestureRecognizer.setTranslation(.zero, in: view)
    if gestureRecognizer.state == .ended {
        // Save the new location. Not relevant here.
    }
}
This worked fine on iOS 13 and below for all the controls mentioned above (the UISlider was a bit glitchy, but it still responded to the pan gesture, and since I don't need the slider's value anyway it was safe to ignore). However, testing the app on iOS 14 reveals that UISlider completely ignores the pan gesture (confirmed by a breakpoint inside the pan handler that never gets hit).
I have looked at Apple's documentation for UISlider and found no change at all related to gesture handling, so this must have been done deliberately at a deeper level. My question is: is there any way to "force" my custom gesture to be executed instead of UISlider's own gestures, without creating a transparent button overlay (which may or may not work) or a dummy slider just for this?
Additionally, I also added a UILongPressGestureRecognizer and a UITapGestureRecognizer to the controls. Both worked fine on the other controls such as UIButton but were completely ignored by the UISlider on iOS 14 (on iOS 13 everything worked fine).
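For reference, those extra recognizers were attached the same way as the pan recognizer; a minimal sketch (the handler names here are illustrative, not from the original project):

```swift
// Hypothetical handler names; attached alongside the pan recognizer.
let longPress = UILongPressGestureRecognizer(target: self, action: #selector(longPressed))
let tap = UITapGestureRecognizer(target: self, action: #selector(tapped))
longPress.delegate = self
tap.delegate = self
myslider.addGestureRecognizer(longPress)
myslider.addGestureRecognizer(tap)
```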
Thanks.
Okay, I found the answer myself after some more digging, with a cue from "claude31" about creating a new clean project and testing from there.
The problem was the overridden beginTracking function. My UISlider is actually subclassed into a custom class, and there beginTracking is overridden as per the code below:
override func beginTracking(_ touch: UITouch, with event: UIEvent?) -> Bool {
    let percent = Float(touch.location(in: self).x / bounds.size.width)
    let delta = percent * (maximumValue - minimumValue)
    let newValue = minimumValue + delta
    self.setValue(newValue, animated: false)
    super.sendActions(for: .valueChanged)
    return true
}
This makes the slider jump immediately to the user's finger location when the touch lands anywhere inside the slider's bounds (without first having to grab the thumb and drag it into position).
On iOS 13 this override does not block the gesture recognizers from being recognized. On iOS 14 beta 4, however, overriding this function with a sendActions(for:) call inside it causes every added gesture recognizer to be ignored completely, be it pan, long press, or even tap.
For me, the solution was simply to add a flag to check whether the pan gesture is required in a given view controller or not. Luckily, I only need the added gesture recognizers in view controllers that do not require the customized beginTracking, and vice versa.
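A minimal sketch of that workaround (the class and flag names are my own invention; it assumes the subclass shown above):

```swift
import UIKit

class JumpableSlider: UISlider {
    /// Set to false in the layout-editing screen so the added
    /// gesture recognizers keep working on iOS 14.
    var jumpsToTouch = true

    override func beginTracking(_ touch: UITouch, with event: UIEvent?) -> Bool {
        guard jumpsToTouch else {
            // Fall back to default tracking, which does not block recognizers.
            return super.beginTracking(touch, with: event)
        }
        let percent = Float(touch.location(in: self).x / bounds.size.width)
        let newValue = minimumValue + percent * (maximumValue - minimumValue)
        setValue(newValue, animated: false)
        sendActions(for: .valueChanged)
        return true
    }
}
```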
Edit:
Originally I wrote that the cause of the problem was the always-true return value; however, rereading the documentation, the default return value for this function is also true. So I think the root cause is the sendActions(for:) call, which causes the added gesture recognizers to be ignored. My answer above has been edited to reflect this.
Related
I have observed that MKMapView zooms out on a double-tap gesture, and I didn't find any way to disable it. I tried adding my own double-tap gesture to catch the double-tap action, but it still zooms out. Any thoughts?
There is no API to disable or change double-tap behavior with MKMapView.
But, if you really want to do so, one approach would be to find and remove the double-tap gesture recognizer from the MKMapView object.
In the project you shared, you could do that in makeUIView in your UIMapView class:
func makeUIView(context: UIViewRepresentableContext<UIMapView>) -> UIViewType {
    self.configureView(mapView, context: context)
    setRegion(to: palce)
    if let v = mapView.subviews.first,
       let recognizers = v.gestureRecognizers {
        let doubleTaps = recognizers
            .compactMap { $0 as? UITapGestureRecognizer }
            .filter { $0.numberOfTapsRequired == 2 }
        for g in doubleTaps {
            v.removeGestureRecognizer(g)
        }
    }
    return mapView
}
I wouldn't necessarily suggest that you do so, however.
Apple may change the MKMapView object in the future, which could then break this.
Users tend to prefer that common UI elements behave in expected ways.
Personally, I get rather annoyed when using an app and the developer has changed standard functionality of UI elements. For example, if I see a table view row with a disclosure indicator (the right-arrow / chevron), I expect that tapping the row will "push" to another screen related to that row. I've seen apps that do not follow that pattern, and it just gets confusing.
I have a Google Map on which I want to recognize long-press gestures on markers. For the time being I just want the map to respond to a long-press gesture. I think I have followed the implementation steps correctly, but I don't seem to get a response from the long press and I'm not sure why. Could it be that, because I am testing on the simulator, it doesn't recognize a click as a long press? Anyway, here is the method I used, so if anybody can see anything I've missed, please let me know.
1.) Dragged a long-press gesture recognizer from the object library onto my map view in the main storyboard.
2.) This put gestureRecognizers --> map View into my referencing outlet connections.
3.) Set the minimum duration to 0.5 sec and set both touches and taps to 1.
4.) In the view controller that contains my mapView, typed:
@IBAction func handleLongtap(recognizer: UILongPressGestureRecognizer) {
    print("PRESSED")
}
5.) Then, back in the main storyboard, control-dragged the long-press gesture recognizer to the view controller and selected "handleLongtap:", which put handleLongtap --> longPressGesture into my view controller's received actions.
Despite there being no error, I do not get "PRESSED" in my console when I long-click in the simulator. Any idea what is going wrong?
You could try adding the gesture recognizer in code:
let longPress = UILongPressGestureRecognizer(target: self, action: #selector(handleTap))
yourPin.addGestureRecognizer(longPress)

@objc func handleTap() {
    print("Tapped")
}
And just tweak it as you need.
I have controllerA, to which I add a UIView subclass.
This view has a long-press gesture inside it, so when you long-press it, it notifies controllerA via a delegate.
Now, while the user is still long-pressing that view, I want to start dragging it (in controllerA) with a pan gesture that I've added to controllerA.
The only problem is that the current long press on that view (inside its class) is eliminating the pan gesture from controllerA (I can drag it only after I remove my finger).
Setting
view.isUserInteractionEnabled = false
after the long press has started will not cancel the touches and let me drag, and I can't find a way to cancel the current gesture unless I remove my finger.
If I understand you correctly, implement this UIGestureRecognizerDelegate method and return the state you need for your recognizers:
func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer, shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
    return true
}
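Note that the delegate method only takes effect if the view controller is actually set as the delegate of the recognizers involved; a minimal sketch of the wiring (recognizer and handler names are illustrative):

```swift
import UIKit

class ControllerA: UIViewController, UIGestureRecognizerDelegate {
    override func viewDidLoad() {
        super.viewDidLoad()
        let pan = UIPanGestureRecognizer(target: self, action: #selector(panned))
        pan.delegate = self // without this, the delegate method is never consulted
        view.addGestureRecognizer(pan)
    }

    @objc func panned(_ recognizer: UIPanGestureRecognizer) {
        // drag logic goes here
    }

    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldRecognizeSimultaneouslyWith other: UIGestureRecognizer) -> Bool {
        return true
    }
}
```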
I found a much simpler strategy.
If you want to start dragging an element during a long press, you do not need a pan gesture at all.
You can simply use a long press, set allowableMovement to something big, and then translate the movement into your view's coordinates:
let long = UILongPressGestureRecognizer(target: self, action: #selector(self.long))
long.allowableMovement = 3000
view.addGestureRecognizer(long)
Then, in the handler, you translate the coordinates:
let point = obj.location(in: view.superview)
if obj.state == .began {
    draggedPoint = point // save the starting point
}
if obj.state == .changed {
    view.frame.origin.x += point.x - draggedPoint.x // translate to our view and drag
    view.frame.origin.y += point.y - draggedPoint.y
    draggedPoint = point // reset so the next delta is incremental
    isDragging = true
}
Because the saved point is the finger's location in the superview, you simply record it when the press begins, measure the finger's movement on each change, and move your view by that delta.
Works great.
I've seen a similar question here but with no working solution.
I have a UIScrollView that is embedded in a UIPageViewController. I would like to have the user able to scroll the UIScrollView normally. Then, if one end of the UIScrollView is reached (all the way to the left, for example) then swiping left (swiping right, but you get what I mean) would trigger the UIPageViewController to shift left to another page.
The default behavior is a bit sketchy. It seems like the gesture recognizers compete and in some cases the UIScrollView does scroll, and in other cases the PageView just shifts to the next page. I'd like to be able to tie this behavior to the content offset of the UIScrollView, or something.
I'll be trying some experiments with the requireFail methods on the various gesture recognizers, but it seems like all of the PageView's gesture recognizer properties are nil'd out because I'm using the scroll transition type. So...walking the hierarchy?
Anyway, I'd appreciate it if anyone knows an easy solution to this.
The way I ended up solving this was by using a replacement for UIPageViewController: EMPageViewController. Because I had direct access to the UIScrollView, I was able to subclass it.
The solution basically came in subclassing and adding toggle flags to the UIScrollViews (the one FOR the EMPageViewController and the one INSIDE it) to control when they should be active or not, based on various factors (such as the content offset of one of the scroll views). Then I was able to override gestureRecognizerShouldBegin(_:) in different ways in the UIScrollView subclasses, checking the toggle flags and the velocity of the panning gesture recognizer.
Something like this (code out of context):
import UIKit

class CustomScrollView: UIScrollView {
    // these toggles get set by some senior ViewController
    var lockDown = false
    var lockUp = false

    override func gestureRecognizerShouldBegin(_ gestureRecognizer: UIGestureRecognizer) -> Bool {
        print("= gesture starting")
        if let panner = gestureRecognizer as? UIPanGestureRecognizer {
            if panner.velocity(in: self).y < 0.0 {
                print("== pulling down, lockDown: \(lockDown)")
                if lockDown {
                    print("=== cancelling paging scroll")
                    return false
                }
            }
            if panner.velocity(in: self).y > 0.0 {
                print("== pulling up, lockUp: \(lockUp)")
                if lockUp {
                    print("=== cancelling paging scroll")
                    return false
                }
            }
        }
        return super.gestureRecognizerShouldBegin(gestureRecognizer)
    }
}
Anyway, I did get it all working slickly in the end. Everyone would probably have a slightly different use case, but the bottom line is that you need to subclass UIScrollView and so UIPageViewController is not an option.
So far I have a grid of buttons and have attached a pan gesture recognizer to the view. I can track the events and get the location of the finger as it moves but there doesn't seem to be the equivalent of a "mouseEnter" message to use to get info about or control properties (such as the highlighting) of the other buttons I pass over.
Am I missing something? How can I accomplish, say, highlighting the buttons under the users' fingers as they pan over them? Does cocoa touch support this or must something else be done?
Thanks.
You are right, there is no such event. UIButton's own events won't help you either, because those require the gesture to actually start inside the button. What you can do instead is get the location of the point you are currently dragging:
@objc func panDetected(_ sender: MoreInformativeGestureRecognizer) {
    let touchPoint = sender.location(in: self.view)
}
And now, when you have the point, you can iterate over all the buttons you have and check whether the point is inside each button:
var buttons: [UIButton] = []
var lastActiveButton: UIButton?
...
// Iterate through all the buttons
for button in buttons {
    // Check the button's area against your touch
    if button.frame.contains(touchPoint) {
        // You are inside the touch area of the button.
        // Here you can, for example, perform some action, store information
        // about the button so you don't do it multiple times, etc. Your call :)
        self.lastActiveButton = button
    }
}
This way you can detect when you go in and out of each button and do whatever you want with the events. Hope it helps!
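To turn that tracking into explicit enter/leave behavior, you can compare the button under the finger against the previously tracked one; a sketch that assumes the buttons array and lastActiveButton property from the snippet above:

```swift
// Hypothetical continuation of the snippet above: highlight the button
// under the finger and un-highlight the one the finger just left.
func updateHighlight(at touchPoint: CGPoint) {
    let current = buttons.first { $0.frame.contains(touchPoint) }
    if current !== lastActiveButton {
        lastActiveButton?.isHighlighted = false // "leave" the old button
        current?.isHighlighted = true           // "enter" the new one
        lastActiveButton = current
    }
}
```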
UIButton inherits from UIControl, which has a number of touch events.
Have you tried those? You can add a listener for those events either through a nib/storyboard or in code (look under the discussion section of the UIControl documentation to see how).
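In particular, UIControl's .touchDragEnter and .touchDragExit events come close to the "mouseEnter" behavior you describe, with the caveat the other answer mentions: they only fire for a touch that began on that control. A sketch:

```swift
// UIControl drag events; note they only fire for a touch that
// started on the control itself.
button.addTarget(self, action: #selector(dragEntered(_:)), for: .touchDragEnter)
button.addTarget(self, action: #selector(dragExited(_:)), for: .touchDragExit)

@objc func dragEntered(_ sender: UIButton) {
    sender.isHighlighted = true
}

@objc func dragExited(_ sender: UIButton) {
    sender.isHighlighted = false
}
```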