How can I set up a gesture recognizer to interact with any UIView when all the views are being animated?

I found the code listed below in https://github.com/DuncanMC/iOS-CAAnimation-group-demo. This particular method lets the user stop a UIView while it is "in flight" in a Core Animation sequence using a gesture recognizer. When the view is tapped, the animation stops. As shown, this code only works on a single animated view. I have many animated views and need to interact with any of them. I think I must set up an array of views (or layers) and cycle through them. Is this correct? How could I do this? Thanks!
/*
This method gets called from a tap gesture recognizer installed on the view myContainerView.
We get the coordinates of the tap from the gesture recognizer and use it to hit-test
myContainerView.layer.presentationLayer to see if the user tapped on the moving image view's
(presentation) layer. The presentation layer's properties are updated as the animation runs, so hit-testing
the presentation layer lets you do tap and/or collision tests on the "in flight" animation.
*/
- (IBAction)testViewTapped:(id)sender
{
    CALayer *tappedLayer;
    id layerDelegate;

    UITapGestureRecognizer *theTapper = (UITapGestureRecognizer *)sender;
    CGPoint touchPoint = [theTapper locationInView: myContainerView];

    if (animationInFlight)
    {
        tappedLayer = [myContainerView.layer.presentationLayer hitTest: touchPoint];
        layerDelegate = [tappedLayer delegate];
        if ((layerDelegate == imageOne && !doingMaskAnimation) ||
            (layerDelegate == waretoLogoLarge && doingMaskAnimation))
        {
            if (myContainerView.layer.speed == 0)
                [self resumeLayer: myContainerView.layer];
            else
            {
                [self pauseLayer: myContainerView.layer];

                // Also kill all the pending label changes that we set up
                // using performSelector:withObject:afterDelay:
                [NSObject cancelPreviousPerformRequestsWithTarget: animationStepLabel];
            }
        }
    }
}

LOL. That demo project is mine. The code is written to find the layer that was tapped, and then use the fact that for a layer that backs a UIView, the layer's delegate is the view itself.
At the point in the code where it finds the layerDelegate, you should make sure it isKindOfClass UIView, then use whatever method is appropriate to match your view with the views you've animated. You could use an IBOutletCollection to keep track of the views that you are animating, or manually create a mutable array and add view objects to it, use view tags, or whatever makes sense for your application.
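For example, here is a minimal sketch of that idea. It assumes an animatedViews collection (an IBOutletCollection or NSMutableArray holding every animated view) and reuses pauseLayer:/resumeLayer: from the demo project; apart from the demo's own methods, all names here are assumptions, not part of the original project.

- (IBAction)containerViewTapped:(UITapGestureRecognizer *)theTapper
{
    CGPoint touchPoint = [theTapper locationInView:myContainerView];

    // Hit-test the presentation layer so "in flight" positions are used.
    CALayer *tappedLayer = [myContainerView.layer.presentationLayer hitTest:touchPoint];
    id layerDelegate = [tappedLayer delegate];

    // A view-backing layer's delegate is the view itself.
    if ([layerDelegate isKindOfClass:[UIView class]] &&
        [self.animatedViews containsObject:layerDelegate])
    {
        UIView *tappedView = (UIView *)layerDelegate;

        // Pause or resume just the tapped view's layer (assuming each view's
        // animation was added to that view's own layer), leaving the others running.
        if (tappedView.layer.speed == 0)
            [self resumeLayer:tappedView.layer];
        else
            [self pauseLayer:tappedView.layer];
    }
}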

Related

Detecting UIView grid squares where a UIPanGestureRecognizer begins and ends

I'm designing a game that takes place on a grid. The grid needs to respond to pan gestures that begin in one square and end in another. I have a custom view controller class called GameVC which contains the grid of UIViews of subclass GameGridSquare. I want the game to perform an action when, for example, a pan gesture begins and ends in neighboring squares. I have a storyboard wired with properties that name each square by row and column: self.A1, self.A2, ... self.H7, self.H8. As a pan gesture is recognized, I want the GameVC to receive the two GameGridSquares where UIGestureRecognizerStateBegan and UIGestureRecognizerStateEnded occurred, so it can determine the appropriate action, like so:
-(void)validatePanGestureFrom:(GameGridSquare*)beginSquare
to:(GameGridSquare*)endSquare
What's the best way to do this? My recognizers are functional and returning coordinates, but I think I need to explore hit-testing.
If I add recognizers to the individual GameGridSquares, I get undesired results. For example, a pan that begins in A1 and ends in B2 would be recognized by A1 alone. This suggests that I need a pan recognizer on GameVC that can detect when the gesture begins and ends in separate subviews.
From what I've read, I believe that the UIGestureRecognizerDelegate protocol may be helpful here. I also understand that a custom UIPanGestureRecognizer subclass would allow me to override hitTest:withEvent but I'm not sure where to even begin with that. Any ideas about how I should approach this?
This was what I was after:
Find which child view was tapped when using UITapGestureRecognizer
UIView* view = gestureRecognizer.view;
CGPoint loc = [gestureRecognizer locationInView:view];
UIView* subview = [view hitTest:loc withEvent:nil];
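Applied to the grid question above, a single pan recognizer on GameVC's view could hit-test the begin and end locations and then call validatePanGestureFrom:to:. A sketch; the handlePan: name and the panBeginSquare property are assumptions of mine, not from the question:

- (void)handlePan:(UIPanGestureRecognizer *)pan
{
    CGPoint loc = [pan locationInView:self.view];
    UIView *hitView = [self.view hitTest:loc withEvent:nil];

    if (pan.state == UIGestureRecognizerStateBegan &&
        [hitView isKindOfClass:[GameGridSquare class]])
    {
        // Remember the square where the pan started.
        self.panBeginSquare = (GameGridSquare *)hitView;
    }
    else if (pan.state == UIGestureRecognizerStateEnded &&
             [hitView isKindOfClass:[GameGridSquare class]] &&
             self.panBeginSquare != nil)
    {
        [self validatePanGestureFrom:self.panBeginSquare to:(GameGridSquare *)hitView];
        self.panBeginSquare = nil;
    }
}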

UITapGestureRecognizer on UIView and Its Subview Respond Together When Subview Is Tapped

A UITapGestureRecognizer is applied to both a UIImageView and its subview (a UITextView). However, when I tap on the subview, both the subview and its parent view receive the tap (i.e. UIImageView + UITextView). It should be only the subview, because that was the one I tapped. I was assuming nested gestures would react first, but apparently the parent receives the first tap and then it goes to the child.
So, there are different solutions out there for various scenarios (not similar to mine, but rather button-inside-scroll-view conflicts). How can I easily fix my issue without subclassing, and with iOS 6+ support? I tried delaying touches at the start for the UIGestureRecognizer on the UIImageView, and I tried setting cancelsTouchesInView to NO - all with no luck.
Try the following code:
Conform your class to <UIGestureRecognizerDelegate>.
Set yourGesture.delegate = self;
Then add this delegate method:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    // Return YES (the default) to allow the gesture recognizer to examine the touch object,
    // NO to prevent the gesture recognizer from seeing this touch object.
    if ([touch.view isKindOfClass:[UITextView class]]) {
        return YES;
    }
    else {
        return NO;
    }
}
Hope it will solve your issue. Enjoy coding!
That's exactly what it is supposed to do.
The view hierarchy is a tree structure, and its traversal during a touch gesture starts from the root node, so it is expected that your parent view receives the gesture before its subviews. The traversal skips nodes for which
userInteractionEnabled = NO.
Since you haven't posted any code, I can't help you play with that flag. A more general solution is to always attach the gesture only to your parent view and, in the gesture delegates, check whether the touch coordinates fall inside one of the subviews; if they do, call your gesture method for that subview.
Not a clean approach, but it works.
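A sketch of that general approach, assuming the recognizer is attached only to the parent image view and that textView is the subview in question (the handleTap:, imageView, and textView names, and the two helper methods, are mine, not from the answer):

- (void)handleTap:(UITapGestureRecognizer *)tap
{
    // The recognizer lives on the parent image view only.
    CGPoint loc = [tap locationInView:self.imageView];

    // textView.frame is expressed in its superview's (the image view's) coordinates.
    if (CGRectContainsPoint(self.textView.frame, loc)) {
        [self textViewTapped];    // the touch landed inside the subview
    } else {
        [self imageViewTapped];   // the touch landed in the parent only
    }
}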
You should implement the UIGestureRecognizerDelegate methods and apply the correct policy to the gesture when multiple gestures are recognized.

Set exclusive touch on multiple UIViews of the same class

I am creating a random number of custom UIViews of the same class and adding them to the UIViewController's view. I'm assigning each of them a UITapGestureRecognizer, but I can't seem to make exclusive touch work:
for (int i = 0; i <= n; i++) {
    ICCatalogProductView *catalogProductView = [[ICCatalogProductView alloc] init];
    [self.view addSubview:catalogProductView];

    UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self
                                                                                 action:@selector(testTouch)];
    [catalogProductView addGestureRecognizer:tapGesture];
    [catalogProductView setExclusiveTouch:YES];
}
If I tap two of the UIViews simultaneously, the method is called twice (not the behaviour I want). Is there any elegant method of solving this, or any method at all?
From the Apple Documentation:
exclusiveTouch only prevents touches in other views during the time in which there's an active touch in the exclusive touch view. That is, if you put a finger down in an exclusive touch view touches won't start in other views until you lift the first finger. It does not prevent touches from starting in other views if there are currently no touches in the exclusiveTouch view.

To truly make this view the only thing on screen that can receive touches you'd need to either add another view over top of everything else to catch the rest of the touches, or subclass a view somewhere in your hierarchy (or your UIWindow itself) and override hitTest:withEvent: to always return your text view when it's visible, or to return nil for touches not in your text view.
In other words, it's only exclusive while your one view has an active touch; it doesn't stop touches that start outside your view.
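One possible workaround, not taken from the quoted documentation, is to swallow the second action when two recognizers fire for the same batch of touches. The handlingTap BOOL property and the testTouch: signature (note the added parameter, so the action would be @selector(testTouch:)) are assumptions of mine:

- (void)testTouch:(UITapGestureRecognizer *)tap
{
    if (self.handlingTap) {
        return;   // another product view's recognizer already fired in this pass
    }
    self.handlingTap = YES;
    dispatch_async(dispatch_get_main_queue(), ^{
        self.handlingTap = NO;   // re-arm once the current run-loop pass is over
    });

    ICCatalogProductView *tappedView = (ICCatalogProductView *)tap.view;
    // ... handle the tap on tappedView ...
}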

How do I make a MKAnnotationView touch sensitive?

I currently have a map view with some annotations on it. The annotations use custom images. The problem I am trying to fix is the sensitivity of the images. When I try to drag them, it feels like I have to touch the exact center for it to be focused on. Is there a way to make the touch bounds bigger?
To do this, you need to subclass MKAnnotationView to create your own custom MKAnnotationView. In your subclass, override the following functions:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent*)event
{
    // Return YES if the point is inside an area you want to be touchable
}

- (UIView*)hitTest:(CGPoint)point withEvent:(UIEvent*)event
{
    // Return the deepest view that the point is inside of.
}
This allows interactive views (such as buttons, etc.) to be pressed. The default implementation in MKAnnotationView is not strict about pointInside and hitTest, because it allows presses that are actually inside one annotation to be sent to a different annotation. It does this by figuring out the annotation center closest to the touch point and sending the events to that annotation; this is so that close-together (overlapping) annotations don't block each other from being selected. However, in your case you probably want to block other annotations if the user is to select and drag the topmost annotation, so the above method is probably what you want, or at least it will set you on the right path.
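For instance, a minimal pointInside:withEvent: override might look like the sketch below; the 20-point inset is an arbitrary choice of mine, not something from the answer.

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    // A sketch: treat the view's bounds, grown by 20 points on every side,
    // as the touchable area so a slightly-off touch still grabs the annotation.
    CGRect touchableArea = CGRectInset(self.bounds, -20.0, -20.0);
    return CGRectContainsPoint(touchableArea, point);
}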
EDIT:
I was asked in comments for an example implementation of hitTest:withEvent: - This depends on exactly what you are trying to achieve. The original question suggested touching and dragging images within the annotation whereas in my case I have some buttons inside the annotation that I want to be interactive. Here's my code:
- (UIView*)hitTest:(CGPoint)point withEvent:(UIEvent*)event
{
    UIView* hitView = [super hitTest:point withEvent:event];
    if (hitView != nil)
    {
        // ensure that the callout appears above all other views
        [self.superview bringSubviewToFront:self];

        // we tapped inside the callout
        if (CGRectContainsPoint(self.resultView.callButton.frame, point))
        {
            hitView = self.resultView.callButton;
        }
        else if (CGRectContainsPoint(self.resultView.addButton.frame, point))
        {
            hitView = self.resultView.addButton;
        }
        else
        {
            hitView = self.resultView.viewDetailsButton;
        }
        [self preventSelectionChange];
    }
    return hitView;
}
As you can see it's quite simple: the MKAnnotationView implementation (called as super on the first line of my implementation) only returns the first (outermost) view; it does not drill down through the view hierarchy to see which sub-view the touch is actually inside. In my case I just check whether the touch is inside one of three buttons and return that button. In other circumstances you may have simple rectangle-based drilling down through the hierarchy, or more complex hit tests, for example to avoid transparent areas within your view so that touches pass through those parts. If you do need to drill down, CGRectContainsPoint can be used the same way I have used it, but remember to offset your points into local view coordinates for each view level you drill into.
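For the coordinate offset mentioned above, the point can be translated with convertPoint:toView: before testing a deeper level. A small sketch reusing the resultView and callButton names from the code above:

// A sketch: express the point in resultView's coordinate space before testing
// frames that are defined relative to resultView.
CGPoint pointInResultView = [self convertPoint:point toView:self.resultView];
if (CGRectContainsPoint(self.resultView.callButton.frame, pointInResultView))
{
    // the touch landed on callButton, measured in resultView's coordinates
}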
The preventSelectionChange method is to prevent my custom annotation from becoming selected; I am using it as a customisable/interactive callout from map pins, and this keeps the pin it relates to selected instead of allowing the selection to change to this annotation.
Did you subclass MKAnnotationView, or did you just set its image property?
Here's the documentation for setting the image property.
Discussion:
Assigning a new image to this property also changes the size of the view’s frame so that it matches the width and height of the new image. The position of the view’s frame does not change.
Check the size of the frame of your annotation view, which is where your object can receive touches.
I implemented something similar in the following manner:
Create a subclass of the gesture recognizer class that handles touches.
Create a subclass of UIImageView; this class uses the recognizer subclass to handle your gestures.
Use this image view in the annotation.
The gesture recognizer subclass will handle your gestures if you perform them at any point in the image. This should help you.
Keep us updated on whether this solution worked for you.
@jhabbott's solution never worked for me, as I mentioned here.
I have an image and a label side by side. The image was shown by setting the annotation view's image property, and the label by adding a UILabel.
I redirected the point(inside:with:) method to the UILabel's area (which included the image zone), and hitTest did return exactly the same view whether I tapped the label or the image. But tapping the label did not produce any callback...
Finally, I ended up enlarging the MKAnnotationView frame to enclose the label + image, set annotationView.image to nil, and created my own custom UIImageView.
Because I wanted the anchor point in the middle of the image, I had to set a custom one:
self.frame = CGRect(x: 0, y: 0, width: self.myLabel.frame.width, height: self.myLabel.frame.height)
self.centerOffset = CGPoint(x: self.frame.width/2, y: self.myImageView.frame.height/2)
Then I deleted point(inside:with:) and hitTest(point:with:) overrides that did nothing.
And now, for the first time, my annotation view is fully responsive to touch.

Shouldn't UIRecognizers fire when a subview is touched?

Views in my iPad app behave as if they prevent their superview's gesture recognizers from firing when the user initiates such a gesture in that view.
Is this expected?
How can I remove that shielding behavior?
What are good practices to debug gesture recognizers?
In more details:
The main "canvas" view of my application, lets the user adds shapes to it with a "long double tap". I attached a gesture recognizer for such gestures to the main view. That works very well: the main view gets called, and reacts by adding a shape to the main view.
Shapes are implemented as subviews of the main view. When the user long-double-taps in the main view, my code instanciate a shape subview, and adds it to the main view. Shape views can be moved around with a long-single-tap recognizer. So I also attach a gesture recognizer for long-single-taps to every shape view. That works very well: the shape view gets called and lets the user move it in the canvas.
However, when the user long-double-taps in a shape view, nothing happens: the shape view is not called, which is expected since it doesn't have a gesture recognizer for long-double-taps. But the main view is not called either. I had thought that since the gesture was not recognized by the shape view, then it would be propagated up in the responder chain to the main view. But this doesn't happen.
My intent is to let the user add overlapping shapes to the main view, so that a long-double-tap on a shape would also add a new shape to the main view.
What could I have missed?
I can of course add a long-double-tap recognizer to shape views, and from there, either forward the gesture to the main view or handle the gesture directly in a way similar to what I do in the main view.
But this sounds wasteful, and more importantly, I'd like to understand the behavior.
Thanks for any suggestion.
As far as I can see, it should pass the message along out of the box.
To ensure both gesture recognizers are not fired, you need to do something like:
[longPress requireGestureRecognizerToFail:doubleLongPress];
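Putting that together with the recognizers described in the question, the setup might look something like this sketch; the recognizer names, the selectors, and the canvasView/shapeView properties are my own, not from the question:

// "Long double tap" on the canvas adds shapes: one full tap, then press and hold.
UILongPressGestureRecognizer *doubleLongPress =
    [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(addShape:)];
doubleLongPress.numberOfTapsRequired = 1;   // one complete tap before the final press
[self.canvasView addGestureRecognizer:doubleLongPress];

// "Long single tap" on each shape view moves it.
UILongPressGestureRecognizer *longPress =
    [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(moveShape:)];
[shapeView addGestureRecognizer:longPress];

// Don't let the move gesture fire while the add gesture could still succeed.
[longPress requireGestureRecognizerToFail:doubleLongPress];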
Update
Just free-styling here, but if you want to limit the gesture to one view, you could try playing with the gesture delegate (this will only respond if the touched view is self.view):
self.myGesture.delegate = self;
In your controller do something like:
//.h
@interface MyController : UIViewController <UIGestureRecognizerDelegate>
// ...
@end

//.m
@implementation MyController
//...
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch
{
    BOOL shouldReceiveTouch = YES;
    if (gestureRecognizer == self.myGesture) {
        shouldReceiveTouch = (touch.view == self.view);
    }
    return shouldReceiveTouch;
}
//...
@end
NB I haven't tested this but I will update when I test it later.
